Oct 01 11:28:27 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 01 11:28:27 crc restorecon[4664]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 11:28:27 crc restorecon[4664]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc 
restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc 
restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 
11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc 
restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 11:28:27 crc 
restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:27 
crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:27 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 
11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 11:28:28 crc 
restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc 
restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc 
restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 11:28:28 crc restorecon[4664]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 
crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc 
restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc 
restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 11:28:28 crc restorecon[4664]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc 
restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 11:28:28 crc restorecon[4664]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 11:28:28 crc restorecon[4664]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 01 11:28:29 crc kubenswrapper[4669]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 11:28:29 crc kubenswrapper[4669]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 01 11:28:29 crc kubenswrapper[4669]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 11:28:29 crc kubenswrapper[4669]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 01 11:28:29 crc kubenswrapper[4669]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 01 11:28:29 crc kubenswrapper[4669]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.346379 4669 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352167 4669 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352226 4669 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352238 4669 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352251 4669 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352264 4669 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352274 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352284 4669 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352294 4669 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352304 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352313 4669 feature_gate.go:330] unrecognized 
feature gate: PrivateHostedZoneAWS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352324 4669 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352333 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352350 4669 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352373 4669 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352387 4669 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352401 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352413 4669 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352436 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352448 4669 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352461 4669 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352472 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352482 4669 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352492 4669 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352502 4669 
feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352512 4669 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352533 4669 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352545 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352555 4669 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352565 4669 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352575 4669 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352585 4669 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352595 4669 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352606 4669 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352616 4669 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352626 4669 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352636 4669 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352647 4669 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352656 4669 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 
01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352675 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352685 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352694 4669 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352704 4669 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352714 4669 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352724 4669 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352734 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352744 4669 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352754 4669 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352764 4669 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352773 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352784 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352802 4669 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352813 4669 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 
01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352823 4669 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352835 4669 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352852 4669 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352865 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352876 4669 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352886 4669 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352897 4669 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352912 4669 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352938 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352948 4669 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352958 4669 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352967 4669 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352976 4669 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352986 4669 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.352995 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.353006 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.353015 4669 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.353025 4669 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.353034 4669 feature_gate.go:330] unrecognized feature gate: Example Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.354900 4669 flags.go:64] FLAG: --address="0.0.0.0" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355238 4669 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355296 4669 flags.go:64] FLAG: --anonymous-auth="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355309 4669 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355322 
4669 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355332 4669 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355346 4669 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355358 4669 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355368 4669 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355378 4669 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355388 4669 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355399 4669 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355408 4669 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355418 4669 flags.go:64] FLAG: --cgroup-root="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355427 4669 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355435 4669 flags.go:64] FLAG: --client-ca-file="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355445 4669 flags.go:64] FLAG: --cloud-config="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355454 4669 flags.go:64] FLAG: --cloud-provider="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355463 4669 flags.go:64] FLAG: --cluster-dns="[]" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355477 4669 flags.go:64] FLAG: --cluster-domain="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355487 4669 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 01 11:28:29 
crc kubenswrapper[4669]: I1001 11:28:29.355497 4669 flags.go:64] FLAG: --config-dir="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355508 4669 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355519 4669 flags.go:64] FLAG: --container-log-max-files="5" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355552 4669 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355562 4669 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355572 4669 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355583 4669 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355592 4669 flags.go:64] FLAG: --contention-profiling="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355602 4669 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355611 4669 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355621 4669 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355630 4669 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355641 4669 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355652 4669 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355661 4669 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355670 4669 flags.go:64] FLAG: --enable-load-reader="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355680 4669 
flags.go:64] FLAG: --enable-server="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355689 4669 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355702 4669 flags.go:64] FLAG: --event-burst="100" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355712 4669 flags.go:64] FLAG: --event-qps="50" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355721 4669 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355730 4669 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355739 4669 flags.go:64] FLAG: --eviction-hard="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355751 4669 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355760 4669 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355769 4669 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355779 4669 flags.go:64] FLAG: --eviction-soft="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355787 4669 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355797 4669 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355806 4669 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355816 4669 flags.go:64] FLAG: --experimental-mounter-path="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355824 4669 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355834 4669 flags.go:64] FLAG: --fail-swap-on="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355843 4669 
flags.go:64] FLAG: --feature-gates="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355854 4669 flags.go:64] FLAG: --file-check-frequency="20s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355863 4669 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355873 4669 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355883 4669 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355892 4669 flags.go:64] FLAG: --healthz-port="10248" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355901 4669 flags.go:64] FLAG: --help="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355911 4669 flags.go:64] FLAG: --hostname-override="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355920 4669 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355929 4669 flags.go:64] FLAG: --http-check-frequency="20s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355938 4669 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355948 4669 flags.go:64] FLAG: --image-credential-provider-config="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355957 4669 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355966 4669 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355976 4669 flags.go:64] FLAG: --image-service-endpoint="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355986 4669 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.355994 4669 flags.go:64] FLAG: --kube-api-burst="100" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356004 4669 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356015 4669 flags.go:64] FLAG: --kube-api-qps="50" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356024 4669 flags.go:64] FLAG: --kube-reserved="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356033 4669 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356042 4669 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356050 4669 flags.go:64] FLAG: --kubelet-cgroups="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356061 4669 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356069 4669 flags.go:64] FLAG: --lock-file="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356110 4669 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356119 4669 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356128 4669 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356141 4669 flags.go:64] FLAG: --log-json-split-stream="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356153 4669 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356162 4669 flags.go:64] FLAG: --log-text-split-stream="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356170 4669 flags.go:64] FLAG: --logging-format="text" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356179 4669 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356189 4669 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 
11:28:29.356197 4669 flags.go:64] FLAG: --manifest-url="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356207 4669 flags.go:64] FLAG: --manifest-url-header="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356219 4669 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356228 4669 flags.go:64] FLAG: --max-open-files="1000000" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356240 4669 flags.go:64] FLAG: --max-pods="110" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356249 4669 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356259 4669 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356267 4669 flags.go:64] FLAG: --memory-manager-policy="None" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356276 4669 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356286 4669 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356295 4669 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356304 4669 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356330 4669 flags.go:64] FLAG: --node-status-max-images="50" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356339 4669 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356348 4669 flags.go:64] FLAG: --oom-score-adj="-999" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356357 4669 flags.go:64] FLAG: --pod-cidr="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356366 4669 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356381 4669 flags.go:64] FLAG: --pod-manifest-path="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356390 4669 flags.go:64] FLAG: --pod-max-pids="-1" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356399 4669 flags.go:64] FLAG: --pods-per-core="0" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356407 4669 flags.go:64] FLAG: --port="10250" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356417 4669 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356426 4669 flags.go:64] FLAG: --provider-id="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356434 4669 flags.go:64] FLAG: --qos-reserved="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356443 4669 flags.go:64] FLAG: --read-only-port="10255" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356452 4669 flags.go:64] FLAG: --register-node="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356460 4669 flags.go:64] FLAG: --register-schedulable="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356469 4669 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356484 4669 flags.go:64] FLAG: --registry-burst="10" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356493 4669 flags.go:64] FLAG: --registry-qps="5" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356502 4669 flags.go:64] FLAG: --reserved-cpus="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356511 4669 flags.go:64] FLAG: --reserved-memory="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356523 4669 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 
11:28:29.356532 4669 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356541 4669 flags.go:64] FLAG: --rotate-certificates="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356549 4669 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356558 4669 flags.go:64] FLAG: --runonce="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356567 4669 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356576 4669 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356585 4669 flags.go:64] FLAG: --seccomp-default="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356593 4669 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356602 4669 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356612 4669 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356621 4669 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356630 4669 flags.go:64] FLAG: --storage-driver-password="root" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356639 4669 flags.go:64] FLAG: --storage-driver-secure="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356648 4669 flags.go:64] FLAG: --storage-driver-table="stats" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356656 4669 flags.go:64] FLAG: --storage-driver-user="root" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356665 4669 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356674 4669 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 01 
11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356684 4669 flags.go:64] FLAG: --system-cgroups="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356693 4669 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356707 4669 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356716 4669 flags.go:64] FLAG: --tls-cert-file="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356724 4669 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356735 4669 flags.go:64] FLAG: --tls-min-version="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356744 4669 flags.go:64] FLAG: --tls-private-key-file="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356752 4669 flags.go:64] FLAG: --topology-manager-policy="none" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356761 4669 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356770 4669 flags.go:64] FLAG: --topology-manager-scope="container" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356778 4669 flags.go:64] FLAG: --v="2" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356790 4669 flags.go:64] FLAG: --version="false" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356803 4669 flags.go:64] FLAG: --vmodule="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356814 4669 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.356824 4669 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357062 4669 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357072 4669 feature_gate.go:330] 
unrecognized feature gate: NutanixMultiSubnets Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357110 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357119 4669 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357129 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357137 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357145 4669 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357153 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357161 4669 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357172 4669 feature_gate.go:330] unrecognized feature gate: Example Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357181 4669 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357190 4669 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357198 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357206 4669 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357214 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357222 4669 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357229 
4669 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357237 4669 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357245 4669 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357253 4669 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357261 4669 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357269 4669 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357277 4669 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357284 4669 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357293 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357301 4669 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357309 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357317 4669 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357325 4669 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357333 4669 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357341 4669 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallAzure Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357349 4669 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357356 4669 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357367 4669 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357376 4669 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357386 4669 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357394 4669 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357402 4669 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357413 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357422 4669 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357430 4669 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357438 4669 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357446 4669 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357454 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357462 4669 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 
01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357473 4669 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357483 4669 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357491 4669 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357500 4669 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357507 4669 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357515 4669 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357526 4669 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357535 4669 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357545 4669 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357553 4669 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357561 4669 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357570 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357579 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357587 4669 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357595 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357606 4669 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357616 4669 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357625 4669 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357633 4669 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357641 4669 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357649 4669 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357658 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357665 4669 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357673 4669 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357680 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.357688 4669 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.357714 4669 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.370905 4669 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.370958 4669 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371103 4669 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371117 4669 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371126 4669 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371137 4669 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371145 4669 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371154 4669 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371162 4669 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371171 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371179 4669 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371191 4669 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371202 4669 feature_gate.go:330] unrecognized feature gate: Example
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371212 4669 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371222 4669 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371231 4669 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371241 4669 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371250 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371260 4669 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371269 4669 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371276 4669 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371285 4669 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371292 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371300 4669 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371308 4669 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371315 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371323 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371334 4669 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371344 4669 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371353 4669 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371361 4669 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371369 4669 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371377 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371385 4669 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371392 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371400 4669 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371411 4669 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371419 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371427 4669 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371435 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371442 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371449 4669 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371457 4669 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371465 4669 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371472 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371480 4669 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371488 4669 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371496 4669 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371503 4669 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371513 4669 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371522 4669 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371530 4669 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371538 4669 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371545 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371553 4669 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371561 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371568 4669 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371577 4669 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371585 4669 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371592 4669 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371600 4669 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371607 4669 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371615 4669 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371625 4669 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371634 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371643 4669 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371650 4669 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371658 4669 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371666 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371674 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371682 4669 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371690 4669 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371700 4669 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.371713 4669 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371931 4669 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371943 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371952 4669 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371962 4669 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371973 4669 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371982 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371991 4669 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.371999 4669 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372008 4669 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372019 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372027 4669 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372035 4669 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372043 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372053 4669 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372063 4669 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372071 4669 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372103 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372112 4669 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372120 4669 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372128 4669 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372136 4669 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372143 4669 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372151 4669 feature_gate.go:330] unrecognized feature gate: Example
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372158 4669 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372166 4669 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372174 4669 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372181 4669 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372189 4669 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372196 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372203 4669 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372211 4669 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372219 4669 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372226 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372234 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372242 4669 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372253 4669 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372261 4669 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372269 4669 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372277 4669 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372284 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372292 4669 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372300 4669 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372308 4669 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372315 4669 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372322 4669 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372330 4669 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372338 4669 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372345 4669 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372353 4669 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372363 4669 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372372 4669 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372380 4669 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372388 4669 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372396 4669 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372403 4669 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372410 4669 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372418 4669 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372426 4669 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372433 4669 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372441 4669 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372448 4669 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372456 4669 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372463 4669 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372471 4669 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372479 4669 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372487 4669 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372495 4669 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372502 4669 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372510 4669 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372518 4669 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.372527 4669 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.372539 4669 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.372754 4669 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.380716 4669 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.380863 4669 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.382587 4669 server.go:997] "Starting client certificate rotation"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.382641 4669 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.384028 4669 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-16 18:05:59.816055952 +0000 UTC
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.384189 4669 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1830h37m30.431873118s for next certificate rotation
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.413222 4669 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.418385 4669 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.448367 4669 log.go:25] "Validated CRI v1 runtime API"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.498281 4669 log.go:25] "Validated CRI v1 image API"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.500641 4669 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.507267 4669 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-01-11-07-36-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.507354 4669 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.539430 4669 manager.go:217] Machine: {Timestamp:2025-10-01 11:28:29.535548351 +0000 UTC m=+0.635113398 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:117c455f-c374-48da-bb29-55b6929cd967 BootID:dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:09:55:ef Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:09:55:ef Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:67:67:49 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:26:e0:73 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:5b:3b:8b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:74:65:3a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5e:8b:13:44:c3:c7 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ce:8e:b6:ec:97:81 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.539830 4669 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.540162 4669 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.542220 4669 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.542565 4669 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.542636 4669 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.542983 4669 topology_manager.go:138] "Creating topology manager with none policy"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.543003 4669 container_manager_linux.go:303] "Creating device plugin manager"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.543655 4669 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.543716 4669 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.543978 4669 state_mem.go:36] "Initialized new in-memory state store"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.544152 4669 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.552333 4669 kubelet.go:418] "Attempting to sync node with API server"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.552381 4669 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.552465 4669 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.552495 4669 kubelet.go:324] "Adding apiserver pod source"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.552518 4669 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.557256 4669 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.558566 4669 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.561613 4669 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563430 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563499 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563523 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563543 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563573 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563596 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563616 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563645 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563669 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.563531 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563690 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563763 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.563785 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.563574 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.563893 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.563895 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.565428 4669 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.566254 4669 server.go:1280] "Started kubelet"
Oct 
01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.568184 4669 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 01 11:28:29 crc systemd[1]: Started Kubernetes Kubelet. Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.568728 4669 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.568965 4669 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.570274 4669 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.572111 4669 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.572174 4669 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.572457 4669 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 16:12:28.087739002 +0000 UTC Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.572697 4669 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1540h43m58.515051174s for next certificate rotation Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.572801 4669 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.572955 4669 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.572979 4669 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 
01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.576550 4669 server.go:460] "Adding debug handlers to kubelet server" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.577005 4669 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.577870 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="200ms" Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.577975 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.578070 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.580046 4669 factory.go:55] Registering systemd factory Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.580130 4669 factory.go:221] Registration of the systemd container factory successfully Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.584011 4669 factory.go:153] Registering CRI-O factory Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.584113 4669 factory.go:221] Registration of the crio container factory successfully Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.584260 4669 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: 
containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.584306 4669 factory.go:103] Registering Raw factory Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.584336 4669 manager.go:1196] Started watching for new ooms in manager Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.585618 4669 manager.go:319] Starting recovery of all containers Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.587204 4669 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.82:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a5a7fd79ae14d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 11:28:29.566198093 +0000 UTC m=+0.665763160,LastTimestamp:2025-10-01 11:28:29.566198093 +0000 UTC m=+0.665763160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597572 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597662 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 01 11:28:29 crc 
kubenswrapper[4669]: I1001 11:28:29.597695 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597723 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597750 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597778 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597805 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597833 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597865 4669 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597894 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597923 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597950 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.597978 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598013 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598040 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598115 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598151 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598183 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598211 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598242 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598269 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598296 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598324 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598349 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598374 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598404 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598437 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598465 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598495 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598521 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598599 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598627 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598659 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598687 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598711 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598737 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598766 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598791 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598819 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" 
seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598846 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598884 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598913 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598938 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598966 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.598992 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 
11:28:29.599020 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.599044 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.599072 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602527 4669 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602598 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602636 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 01 11:28:29 crc 
kubenswrapper[4669]: I1001 11:28:29.602664 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602690 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602728 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602758 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602787 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602816 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602845 4669 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602871 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602898 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602926 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602953 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.602982 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603006 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603033 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603060 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603142 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603174 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603202 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603229 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603260 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603287 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603313 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603339 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603364 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603393 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" 
seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603419 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603446 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603473 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603498 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603528 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603556 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 
11:28:29.603581 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603607 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603632 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603656 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603682 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603707 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603733 4669 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603757 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603783 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603808 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603835 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603859 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603884 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603912 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603937 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603963 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.603989 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604018 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604044 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" 
seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604070 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604169 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604198 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604224 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604256 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604283 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 
11:28:29.604311 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604345 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604373 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604407 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604436 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604466 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604491 4669 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604515 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604542 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604663 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604689 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604708 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604727 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604745 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604763 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604782 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604800 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604817 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604835 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604855 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604873 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604893 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604911 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604928 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604946 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604964 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.604983 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605004 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605023 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605043 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605062 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605112 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605130 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605155 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605173 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605191 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605209 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605232 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605251 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605269 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605287 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605306 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605324 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605341 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605360 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605377 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605398 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605416 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605433 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605451 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605470 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605489 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605507 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605527 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605544 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605562 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605582 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605599 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605617 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605635 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605653 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" 
seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605673 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605693 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605713 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605732 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605754 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605774 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605794 4669 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605813 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605833 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605852 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605870 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605888 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605907 4669 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605925 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605943 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605963 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.605982 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606001 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606018 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606036 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606054 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606096 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606116 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606133 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606153 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" 
seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606171 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606189 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606209 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606228 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606246 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606265 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: 
I1001 11:28:29.606284 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606304 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606339 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606358 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606376 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606395 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606414 4669 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606433 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606454 4669 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606478 4669 reconstruct.go:97] "Volume reconstruction finished" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.606494 4669 reconciler.go:26] "Reconciler: start to sync state" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.617301 4669 manager.go:324] Recovery completed Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.637592 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.639765 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.639826 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.639839 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.640557 4669 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.641213 4669 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.641235 4669 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.641258 4669 state_mem.go:36] "Initialized new in-memory state store" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.642759 4669 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.642805 4669 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.642837 4669 kubelet.go:2335] "Starting kubelet main sync loop" Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.642881 4669 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 01 11:28:29 crc kubenswrapper[4669]: W1001 11:28:29.643769 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.643887 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.668656 4669 policy_none.go:49] "None policy: Start" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.669499 4669 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 01 
11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.669556 4669 state_mem.go:35] "Initializing new in-memory state store" Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.673832 4669 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.723318 4669 manager.go:334] "Starting Device Plugin manager" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.723418 4669 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.723452 4669 server.go:79] "Starting device plugin registration server" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.724335 4669 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.724375 4669 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.724716 4669 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.724884 4669 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.724916 4669 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.734016 4669 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.743233 4669 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.743338 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.748520 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.748555 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.748565 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.748789 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.748983 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.749068 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.752912 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.752965 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.752926 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.753013 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.752980 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.753032 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.753288 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.753480 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.753586 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.754190 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.754247 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.754268 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.754427 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.754582 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.754613 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.755227 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.755282 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.755299 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.755603 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.755631 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.755657 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.756009 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.756065 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.756114 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.756365 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc 
kubenswrapper[4669]: I1001 11:28:29.756461 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.756492 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.757385 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.757423 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.757394 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.757486 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.757505 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.757446 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.757806 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.757861 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.759052 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.759111 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.759125 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.778887 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="400ms" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.809612 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.809676 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.809716 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.809751 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.809812 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.809965 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.810034 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.810068 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.810142 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.810246 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.810351 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.810444 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.810493 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.810533 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.810602 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.825049 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.827161 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.827238 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.827258 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.827305 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 11:28:29 crc kubenswrapper[4669]: E1001 11:28:29.828011 4669 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.82:6443: connect: connection refused" node="crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912010 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912135 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912178 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912211 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912245 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912278 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912308 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912302 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912339 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912402 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912435 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912465 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912486 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912496 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912561 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912565 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 
11:28:29.912597 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912629 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912650 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912701 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912749 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912860 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912913 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912953 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.912994 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.913039 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.913120 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.913212 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.913268 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 11:28:29 crc kubenswrapper[4669]: I1001 11:28:29.913337 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.028977 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.030565 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.030613 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.030631 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.030668 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 11:28:30 crc kubenswrapper[4669]: E1001 11:28:30.031138 4669 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.081128 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.087041 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.102055 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.126249 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.135509 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:30 crc kubenswrapper[4669]: W1001 11:28:30.137385 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-ee9ad6de10b8292ece5ddabccb8e7cd3afa75ad373f158794ae77d99ee10571c WatchSource:0}: Error finding container ee9ad6de10b8292ece5ddabccb8e7cd3afa75ad373f158794ae77d99ee10571c: Status 404 returned error can't find the container with id ee9ad6de10b8292ece5ddabccb8e7cd3afa75ad373f158794ae77d99ee10571c Oct 01 11:28:30 crc kubenswrapper[4669]: W1001 11:28:30.143724 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f365f5afad2d7b88640430ec12ac1af970ef73a5c9e4c434cf7773e699019bff WatchSource:0}: Error finding container f365f5afad2d7b88640430ec12ac1af970ef73a5c9e4c434cf7773e699019bff: Status 404 returned error can't find the container with id f365f5afad2d7b88640430ec12ac1af970ef73a5c9e4c434cf7773e699019bff Oct 01 11:28:30 crc kubenswrapper[4669]: W1001 11:28:30.151940 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-bbd91b5b355c909e45e3ed82e8468afff32f411bfd7fbf0ac130740969e07dee WatchSource:0}: Error finding container bbd91b5b355c909e45e3ed82e8468afff32f411bfd7fbf0ac130740969e07dee: Status 404 returned error can't find the container with id bbd91b5b355c909e45e3ed82e8468afff32f411bfd7fbf0ac130740969e07dee Oct 01 11:28:30 crc kubenswrapper[4669]: W1001 11:28:30.155359 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-fb812b64b0912a8747c98dfe5d1ed071d557026cf08e9c84af1c25cd958d42be 
WatchSource:0}: Error finding container fb812b64b0912a8747c98dfe5d1ed071d557026cf08e9c84af1c25cd958d42be: Status 404 returned error can't find the container with id fb812b64b0912a8747c98dfe5d1ed071d557026cf08e9c84af1c25cd958d42be Oct 01 11:28:30 crc kubenswrapper[4669]: W1001 11:28:30.158139 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-520e4bfed0a8a48a2ca0961ca32e3f853a30bafed602b1bfd379ed359f32a797 WatchSource:0}: Error finding container 520e4bfed0a8a48a2ca0961ca32e3f853a30bafed602b1bfd379ed359f32a797: Status 404 returned error can't find the container with id 520e4bfed0a8a48a2ca0961ca32e3f853a30bafed602b1bfd379ed359f32a797 Oct 01 11:28:30 crc kubenswrapper[4669]: E1001 11:28:30.180241 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="800ms" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.431697 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.433737 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.433808 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.433828 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.433871 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 11:28:30 crc kubenswrapper[4669]: E1001 11:28:30.434466 4669 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Oct 01 11:28:30 crc kubenswrapper[4669]: W1001 11:28:30.548540 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:30 crc kubenswrapper[4669]: E1001 11:28:30.548682 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.570382 4669 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:30 crc kubenswrapper[4669]: W1001 11:28:30.624312 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:30 crc kubenswrapper[4669]: E1001 11:28:30.624441 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Oct 01 11:28:30 crc kubenswrapper[4669]: W1001 
11:28:30.635630 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:30 crc kubenswrapper[4669]: E1001 11:28:30.635726 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.649770 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"bbd91b5b355c909e45e3ed82e8468afff32f411bfd7fbf0ac130740969e07dee"} Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.651891 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f365f5afad2d7b88640430ec12ac1af970ef73a5c9e4c434cf7773e699019bff"} Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.653641 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ee9ad6de10b8292ece5ddabccb8e7cd3afa75ad373f158794ae77d99ee10571c"} Oct 01 11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.655619 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"520e4bfed0a8a48a2ca0961ca32e3f853a30bafed602b1bfd379ed359f32a797"} Oct 01 
11:28:30 crc kubenswrapper[4669]: I1001 11:28:30.657687 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fb812b64b0912a8747c98dfe5d1ed071d557026cf08e9c84af1c25cd958d42be"} Oct 01 11:28:30 crc kubenswrapper[4669]: W1001 11:28:30.914055 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:30 crc kubenswrapper[4669]: E1001 11:28:30.914669 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Oct 01 11:28:30 crc kubenswrapper[4669]: E1001 11:28:30.980868 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="1.6s" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.235342 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.237467 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.237580 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.237601 4669 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.237636 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 11:28:31 crc kubenswrapper[4669]: E1001 11:28:31.238355 4669 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.570365 4669 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.664654 4669 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7" exitCode=0 Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.664813 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7"} Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.664934 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.666339 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.666386 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.666405 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.668369 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6"} Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.668444 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee"} Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.668472 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73"} Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.670950 4669 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27" exitCode=0 Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.671023 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27"} Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.671172 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.672843 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.672905 
4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.672933 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.674690 4669 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d" exitCode=0 Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.674817 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d"} Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.674916 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.676868 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.676937 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.676957 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.677366 4669 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="bb88678d7ca6bbb14f7cb6061506f3827bcb8a487c7e93b36d3964b1a7ff359b" exitCode=0 Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.677448 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.677440 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"bb88678d7ca6bbb14f7cb6061506f3827bcb8a487c7e93b36d3964b1a7ff359b"} Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.678490 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.678711 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.678736 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.678747 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.679880 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.679936 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:31 crc kubenswrapper[4669]: I1001 11:28:31.679956 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:32 crc kubenswrapper[4669]: W1001 11:28:32.529293 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:32 crc kubenswrapper[4669]: E1001 11:28:32.529413 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list 
*v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.570632 4669 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:32 crc kubenswrapper[4669]: E1001 11:28:32.582598 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="3.2s" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.683744 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9"} Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.683782 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.685143 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.685208 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.685221 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.687860 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb"} Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.687918 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880"} Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.687931 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35"} Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.689866 4669 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8" exitCode=0 Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.689950 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8"} Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.690028 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.691201 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.691229 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.691238 4669 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.692286 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"df6b37053905d2d478530b3bbf178e2108133f8341b333654552f0fe41f1ad4a"} Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.692379 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.693793 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.693815 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.693824 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.701798 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"db32ff64b8dd7190a15a38b7c3de701a2381b513d437e4577e3ddb54fa614c22"} Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.701853 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"15bbac1ae30396ae6308715bed291356384e32beecf27a0464939b01a6c6d44c"} Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.701864 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"eb4b78de0070f57401ab73d5f6e47d62a5752c1db89a5a25d2ba05592c6878a8"} Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.701929 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.703053 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.703096 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.703109 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.838499 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.840751 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.840797 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.840807 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:32 crc kubenswrapper[4669]: I1001 11:28:32.840839 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 11:28:32 crc kubenswrapper[4669]: E1001 11:28:32.841369 4669 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Oct 01 11:28:33 crc kubenswrapper[4669]: W1001 11:28:33.425258 4669 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Oct 01 11:28:33 crc kubenswrapper[4669]: E1001 11:28:33.425371 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.432740 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.707641 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.710573 4669 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fbb6d7d10ded0414bfb4997669e5c54851b0f8bd371986dd26b8cddb4389ebbd" exitCode=255 Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.710676 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fbb6d7d10ded0414bfb4997669e5c54851b0f8bd371986dd26b8cddb4389ebbd"} Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.710781 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.710814 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b"} Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.712241 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.712286 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.712304 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.712987 4669 scope.go:117] "RemoveContainer" containerID="fbb6d7d10ded0414bfb4997669e5c54851b0f8bd371986dd26b8cddb4389ebbd" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.714576 4669 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4" exitCode=0 Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.714756 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.714779 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.714832 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.715209 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4"} Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.714769 4669 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.715375 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.716582 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.716625 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.716644 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.716759 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.716795 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.716814 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.717915 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.717978 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.718001 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.717927 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:33 crc 
kubenswrapper[4669]: I1001 11:28:33.718069 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.718118 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:33 crc kubenswrapper[4669]: I1001 11:28:33.944154 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.112047 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.720020 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.722830 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4"} Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.723023 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.723267 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.724797 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.724854 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 
11:28:34.724875 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.728793 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d"} Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.728864 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242"} Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.728882 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.728892 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e"} Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.728934 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.730492 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.730553 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.730579 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.730712 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.730786 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:34 crc kubenswrapper[4669]: I1001 11:28:34.730813 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:35 crc kubenswrapper[4669]: I1001 11:28:35.739187 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:35 crc kubenswrapper[4669]: I1001 11:28:35.739409 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68"} Oct 01 11:28:35 crc kubenswrapper[4669]: I1001 11:28:35.739504 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:35 crc kubenswrapper[4669]: I1001 11:28:35.739718 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1"} Oct 01 11:28:35 crc kubenswrapper[4669]: I1001 11:28:35.739849 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:35 crc kubenswrapper[4669]: I1001 11:28:35.741347 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:35 crc kubenswrapper[4669]: I1001 11:28:35.741392 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:35 crc kubenswrapper[4669]: I1001 11:28:35.741406 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
01 11:28:35 crc kubenswrapper[4669]: I1001 11:28:35.743165 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:35 crc kubenswrapper[4669]: I1001 11:28:35.743223 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:35 crc kubenswrapper[4669]: I1001 11:28:35.743244 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.042377 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.044328 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.044386 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.044404 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.044439 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.743061 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.743300 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.747477 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.747524 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.747571 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.747593 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.747573 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:36 crc kubenswrapper[4669]: I1001 11:28:36.747729 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.066622 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.366123 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.366420 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.368305 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.368366 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.368379 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.377030 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:37 crc 
kubenswrapper[4669]: I1001 11:28:37.745999 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.745999 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.748908 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.748971 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.748989 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.748916 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.749068 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:37 crc kubenswrapper[4669]: I1001 11:28:37.749113 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:39 crc kubenswrapper[4669]: E1001 11:28:39.734222 4669 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 11:28:39 crc kubenswrapper[4669]: I1001 11:28:39.804129 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:39 crc kubenswrapper[4669]: I1001 11:28:39.804420 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:39 crc kubenswrapper[4669]: I1001 11:28:39.806161 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:39 crc kubenswrapper[4669]: I1001 11:28:39.806238 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:39 crc kubenswrapper[4669]: I1001 11:28:39.806266 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:41 crc kubenswrapper[4669]: I1001 11:28:41.827366 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 01 11:28:41 crc kubenswrapper[4669]: I1001 11:28:41.827637 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:41 crc kubenswrapper[4669]: I1001 11:28:41.829172 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:41 crc kubenswrapper[4669]: I1001 11:28:41.829230 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:41 crc kubenswrapper[4669]: I1001 11:28:41.829258 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:41 crc kubenswrapper[4669]: I1001 11:28:41.918677 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:41 crc kubenswrapper[4669]: I1001 11:28:41.918923 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:41 crc kubenswrapper[4669]: I1001 11:28:41.920834 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:41 crc kubenswrapper[4669]: I1001 11:28:41.920889 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 11:28:41 crc kubenswrapper[4669]: I1001 11:28:41.920910 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:41 crc kubenswrapper[4669]: I1001 11:28:41.926183 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:42 crc kubenswrapper[4669]: I1001 11:28:42.761120 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:42 crc kubenswrapper[4669]: I1001 11:28:42.762456 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:42 crc kubenswrapper[4669]: I1001 11:28:42.762508 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:42 crc kubenswrapper[4669]: I1001 11:28:42.762527 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:42 crc kubenswrapper[4669]: I1001 11:28:42.804744 4669 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 11:28:42 crc kubenswrapper[4669]: I1001 11:28:42.804834 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 11:28:43 crc kubenswrapper[4669]: W1001 11:28:43.516311 4669 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 01 11:28:43 crc kubenswrapper[4669]: I1001 11:28:43.516486 4669 trace.go:236] Trace[370677413]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 11:28:33.514) (total time: 10001ms): Oct 01 11:28:43 crc kubenswrapper[4669]: Trace[370677413]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:28:43.516) Oct 01 11:28:43 crc kubenswrapper[4669]: Trace[370677413]: [10.001492858s] [10.001492858s] END Oct 01 11:28:43 crc kubenswrapper[4669]: E1001 11:28:43.516516 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 01 11:28:43 crc kubenswrapper[4669]: I1001 11:28:43.571362 4669 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 01 11:28:43 crc kubenswrapper[4669]: W1001 11:28:43.679185 4669 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 01 11:28:43 crc kubenswrapper[4669]: I1001 11:28:43.679347 4669 trace.go:236] Trace[1084661552]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 11:28:33.677) (total time: 10002ms): Oct 01 11:28:43 crc kubenswrapper[4669]: 
Trace[1084661552]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:28:43.679) Oct 01 11:28:43 crc kubenswrapper[4669]: Trace[1084661552]: [10.002096825s] [10.002096825s] END Oct 01 11:28:43 crc kubenswrapper[4669]: E1001 11:28:43.679384 4669 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 01 11:28:44 crc kubenswrapper[4669]: I1001 11:28:44.112539 4669 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="" start-of-body= Oct 01 11:28:44 crc kubenswrapper[4669]: I1001 11:28:44.122989 4669 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 11:28:44 crc kubenswrapper[4669]: I1001 11:28:44.123072 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 11:28:44 crc kubenswrapper[4669]: I1001 11:28:44.127118 4669 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 11:28:44 crc kubenswrapper[4669]: I1001 11:28:44.127194 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 11:28:46 crc kubenswrapper[4669]: I1001 11:28:46.914292 4669 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 01 11:28:48 crc kubenswrapper[4669]: I1001 11:28:48.204726 4669 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 01 11:28:48 crc kubenswrapper[4669]: I1001 11:28:48.204863 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.119994 4669 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.120047 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.120282 4669 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.120723 4669 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.120809 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.121354 4669 trace.go:236] Trace[1504967574]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 11:28:39.024) (total time: 10096ms): Oct 01 11:28:49 crc kubenswrapper[4669]: Trace[1504967574]: ---"Objects listed" error: 10096ms (11:28:49.121) Oct 01 11:28:49 crc kubenswrapper[4669]: Trace[1504967574]: [10.096917024s] [10.096917024s] END Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.121378 4669 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.121675 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.121767 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.121839 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:49 crc 
kubenswrapper[4669]: E1001 11:28:49.126199 4669 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.126783 4669 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.126902 4669 trace.go:236] Trace[1244663584]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 11:28:38.507) (total time: 10619ms): Oct 01 11:28:49 crc kubenswrapper[4669]: Trace[1244663584]: ---"Objects listed" error: 10619ms (11:28:49.126) Oct 01 11:28:49 crc kubenswrapper[4669]: Trace[1244663584]: [10.619469324s] [10.619469324s] END Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.126926 4669 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.127390 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.217747 4669 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.566167 4669 apiserver.go:52] "Watching apiserver" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.581940 4669 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.582538 4669 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.582948 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.583032 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.583112 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.583363 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.583449 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.583530 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.583593 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.583869 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.583942 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.585554 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.585639 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.586401 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.586456 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.587327 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.587554 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.591561 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.591829 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.591866 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.677965 4669 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 01 11:28:49 crc kubenswrapper[4669]: 
I1001 11:28:49.688348 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.707501 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.720103 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.731158 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.731377 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.731447 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.731620 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.731702 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.731764 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.731816 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.731879 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 
11:28:49.731943 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.732001 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.732071 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.732185 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.732241 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.732303 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.731888 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.732422 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733400 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733500 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733534 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733558 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733583 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733601 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733619 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733643 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733670 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733759 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733784 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733809 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733826 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733843 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733858 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733874 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733890 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733908 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733926 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733972 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.733997 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.734013 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.734030 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.734099 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.734117 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 11:28:49 crc 
kubenswrapper[4669]: I1001 11:28:49.734156 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.734171 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.734189 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.734204 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.734224 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.734239 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.734170 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.734254 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.735900 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736003 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736124 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736180 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736223 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736272 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736327 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736366 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736422 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736479 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736533 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736602 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736656 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736777 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod 
"49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736710 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736926 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.736997 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737018 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737069 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737283 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737392 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737494 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737596 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737651 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737716 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737741 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737773 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737829 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737879 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737930 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737984 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.738043 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.738147 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.738267 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.737971 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.738200 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.738851 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.738909 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.738984 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739041 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: 
I1001 11:28:49.739089 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739102 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739127 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739187 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739253 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739315 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739373 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739453 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739508 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739561 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739630 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739684 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739752 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739813 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739875 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739925 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 11:28:49 crc 
kubenswrapper[4669]: I1001 11:28:49.742345 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.742458 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.742572 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.742738 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.742865 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.743032 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.743357 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745194 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745247 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745286 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745321 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 
11:28:49.745348 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745384 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745415 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745449 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745475 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745507 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745535 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745562 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745590 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745618 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745645 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 11:28:49 crc 
kubenswrapper[4669]: I1001 11:28:49.745674 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745707 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745735 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745761 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745853 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745885 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745909 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745935 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745987 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746017 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746044 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746098 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746196 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746228 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746253 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746279 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746325 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746351 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746401 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746428 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746458 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746487 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746525 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746555 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746596 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746622 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746646 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746674 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746700 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746781 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746808 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746838 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746866 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746894 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746921 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746945 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746972 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747001 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747032 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 11:28:49 crc 
kubenswrapper[4669]: I1001 11:28:49.747059 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747106 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747134 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747158 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747182 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747205 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747229 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747256 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747280 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747312 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747336 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 11:28:49 crc 
kubenswrapper[4669]: I1001 11:28:49.747358 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747382 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747448 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747477 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747501 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747525 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747549 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747572 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747593 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747622 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747650 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747673 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747700 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747725 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747751 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747776 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747800 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747824 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747853 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747878 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747905 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747935 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747959 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747984 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748012 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748044 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748266 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748299 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 11:28:49 crc 
kubenswrapper[4669]: I1001 11:28:49.748399 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748441 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748472 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748506 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748543 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748579 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748610 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748642 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748681 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748712 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748737 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748787 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748813 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748837 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748949 4669 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748969 4669 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748985 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.749002 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.749020 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.749035 4669 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.749049 4669 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.749064 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.749099 4669 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.749114 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745221 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.750449 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739309 4669 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.755234 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739729 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739886 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.739838 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.740032 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.740346 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.741027 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.741429 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.741489 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.742119 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.742737 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.743375 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.743365 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.743614 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.743652 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.743780 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.744140 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.744154 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.744470 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.744480 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.744588 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.744814 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745042 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745065 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745116 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745265 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745402 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.744534 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.745814 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746157 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.746942 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747010 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747036 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747157 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747813 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.747871 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748301 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.748987 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.749278 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.749741 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.749827 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.750030 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.765723 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.765834 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.766178 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.766407 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.766570 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.766327 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.766962 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.767387 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.767363 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.767456 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.769457 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.769509 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.770198 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.770465 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.770479 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.755789 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.770507 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.771125 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.771235 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.768194 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.771425 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.771608 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.771738 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.771898 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:28:50.271843192 +0000 UTC m=+21.371408219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.772126 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.772113 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.771549 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.772694 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.772770 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.773097 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.773398 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.773492 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.773933 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.774023 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.774350 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.774514 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.774574 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.775177 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.775191 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.775503 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.777294 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.777885 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.777930 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.778292 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.778628 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.778715 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.778730 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.779012 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.779183 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.779348 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.779584 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.779734 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.779794 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.779861 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.780522 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.780659 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.780701 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.780976 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.782300 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.782356 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.781328 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.782775 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.782806 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.783314 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.783768 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.783891 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.783943 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.784043 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.784572 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.780237 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.783533 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.785000 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.783781 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.780319 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.785293 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.785437 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.785616 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.785371 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.786483 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.786545 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.787827 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.787812 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.787888 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.788149 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.787993 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.787371 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.788427 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.788625 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.788702 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:50.288653365 +0000 UTC m=+21.388218392 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.789829 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.790122 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.790477 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.791437 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.791572 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.791995 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.792262 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.792731 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.793355 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.797179 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.798126 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.798388 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.799408 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.800548 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.800597 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.801229 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.801501 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.801816 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.802017 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.801931 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.802538 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.803317 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.803303 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.803948 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.804525 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.804679 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.806254 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.806691 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.807355 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.808695 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.809505 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.809526 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.809919 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.810356 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.810653 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.810673 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.810786 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.810838 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.810892 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.811070 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.811366 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.811644 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.811524 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.811418 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.811695 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.811910 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:50.311883695 +0000 UTC m=+21.411448682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.812107 4669 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.813217 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.815605 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.820955 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.827614 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.837433 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.839431 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.839779 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.839812 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.839829 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.839938 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:50.339901313 +0000 UTC m=+21.439466290 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.840222 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.842438 4669 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4" exitCode=255 Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.842505 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4"} Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.842581 4669 scope.go:117] "RemoveContainer" containerID="fbb6d7d10ded0414bfb4997669e5c54851b0f8bd371986dd26b8cddb4389ebbd" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.845018 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.848252 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.848795 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.849455 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.849513 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.849526 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.849593 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered 
Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.849615 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.849720 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:50.349686404 +0000 UTC m=+21.449251561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850174 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850257 4669 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850282 4669 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850298 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850316 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850330 4669 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850344 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850358 4669 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850376 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850391 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850406 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850420 4669 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850431 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850447 4669 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850458 4669 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850479 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850493 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" 
DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850509 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850542 4669 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850552 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850564 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850578 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850589 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850601 4669 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850613 4669 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850623 4669 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850632 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850642 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850651 4669 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850673 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850683 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850693 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850720 4669 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850730 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850740 4669 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850749 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850759 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850768 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850778 4669 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850788 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850797 4669 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850806 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850815 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850824 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850835 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850847 4669 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850856 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850865 4669 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850877 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850885 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850893 4669 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850902 4669 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850915 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850925 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850935 4669 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850944 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850971 4669 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850980 4669 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850990 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.850998 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851039 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851050 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851061 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851070 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851095 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851107 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851118 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851127 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851135 4669 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851144 4669 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851154 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851162 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851174 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851184 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851194 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851202 4669 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851212 4669 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851232 4669 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851240 4669 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851249 4669 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851258 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851269 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851278 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851287 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851296 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851305 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851315 4669 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851324 4669 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851332 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851343 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851353 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851364 4669 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851404 4669 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851417 4669 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851426 4669 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851435 4669 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851446 4669 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851459 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851470 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851479 4669 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851488 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851497 4669 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851506 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851514 4669 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851524 4669 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851533 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851542 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851552 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851561 4669 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851570 4669 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851579 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851589 4669 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851598 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851606 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851615 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851624 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851634 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851649 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851659 4669 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851669 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851677 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851686 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851695 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851704 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851712 4669 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851721 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851732 4669 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851741 4669 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851751 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851760 4669 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851769 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854171 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854198 4669 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854207 4669 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854217 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854235 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854244 4669 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854252 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854261 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854270 4669 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854279 4669 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854298 4669 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854307 4669 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854316 4669 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854324 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854333 4669 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854341 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854348 4669 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854359 4669 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854372 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854384 4669 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854397 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854409 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854444 4669 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854456 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854469 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854481 4669 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854494 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854508 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854521 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854533 4669 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854547 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854558 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854570 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854581 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854639 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854650 4669 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854658 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854677 4669 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854687 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854698 4669 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName:
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854707 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854716 4669 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854726 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854736 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.854746 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851248 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851353 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.851158 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.870517 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.874502 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.875356 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.875453 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.875707 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: 
"5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.875779 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.876430 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.876624 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.878099 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.881329 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.899360 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.900361 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.901031 4669 scope.go:117] "RemoveContainer" containerID="79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4" Oct 01 11:28:49 crc kubenswrapper[4669]: E1001 11:28:49.901223 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.901034 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.901400 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.901738 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.905976 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.910393 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.912573 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.915162 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.918717 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.929069 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: W1001 11:28:49.930231 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-50246c3c8a4503a98aee8db8ede6347a59af638e2b122267b9888471a3c3cc9a WatchSource:0}: Error finding container 50246c3c8a4503a98aee8db8ede6347a59af638e2b122267b9888471a3c3cc9a: Status 404 returned error can't find the container with id 50246c3c8a4503a98aee8db8ede6347a59af638e2b122267b9888471a3c3cc9a Oct 01 11:28:49 crc kubenswrapper[4669]: W1001 11:28:49.935683 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-7376449bdde8d3ef39fd6093cbc4b7fb79b83c02df62c69297f29e7e91cd0891 WatchSource:0}: Error finding container 7376449bdde8d3ef39fd6093cbc4b7fb79b83c02df62c69297f29e7e91cd0891: Status 404 
returned error can't find the container with id 7376449bdde8d3ef39fd6093cbc4b7fb79b83c02df62c69297f29e7e91cd0891 Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.942454 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay
.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.956139 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.956175 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.956185 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc 
kubenswrapper[4669]: I1001 11:28:49.956196 4669 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.956209 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.956218 4669 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.956227 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.956238 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.964475 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbb6d7d10ded0414bfb4997669e5c54851b0f8bd371986dd26b8cddb4389ebbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"message\\\":\\\"W1001 11:28:32.976655 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 11:28:32.977004 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759318112 cert, and key in /tmp/serving-cert-2803551764/serving-signer.crt, /tmp/serving-cert-2803551764/serving-signer.key\\\\nI1001 11:28:33.210414 1 observer_polling.go:159] Starting file observer\\\\nW1001 11:28:33.216725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 11:28:33.217112 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:33.222626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2803551764/tls.crt::/tmp/serving-cert-2803551764/tls.key\\\\\\\"\\\\nF1001 11:28:33.405042 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 
11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:49 crc kubenswrapper[4669]: I1001 11:28:49.982968 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:49.999944 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.018600 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.031905 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.046787 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.057408 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.070256 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.082117 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.094960 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.110334 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbb6d7d10ded0414bfb4997669e5c54851b0f8bd371986dd26b8cddb4389ebbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"message\\\":\\\"W1001 11:28:32.976655 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 11:28:32.977004 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759318112 cert, and key in /tmp/serving-cert-2803551764/serving-signer.crt, /tmp/serving-cert-2803551764/serving-signer.key\\\\nI1001 11:28:33.210414 1 observer_polling.go:159] Starting file observer\\\\nW1001 11:28:33.216725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 11:28:33.217112 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:33.222626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2803551764/tls.crt::/tmp/serving-cert-2803551764/tls.key\\\\\\\"\\\\nF1001 11:28:33.405042 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 
11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.125807 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.142526 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.203336 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zmmr7"] Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.203674 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zmmr7" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.206330 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.206449 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.209886 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.222326 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbb6d7d10ded0414bfb4997669e5c54851b0f8bd371986dd26b8cddb4389ebbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"message\\\":\\\"W1001 11:28:32.976655 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 11:28:32.977004 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759318112 cert, and key in /tmp/serving-cert-2803551764/serving-signer.crt, /tmp/serving-cert-2803551764/serving-signer.key\\\\nI1001 11:28:33.210414 1 observer_polling.go:159] Starting file observer\\\\nW1001 11:28:33.216725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 11:28:33.217112 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:33.222626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2803551764/tls.crt::/tmp/serving-cert-2803551764/tls.key\\\\\\\"\\\\nF1001 11:28:33.405042 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.240136 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.252919 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.263774 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.279536 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.305240 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.330349 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.358517 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.358594 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.358622 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzkl\" (UniqueName: \"kubernetes.io/projected/5f297cf4-7106-4aee-af55-e0a404e56b39-kube-api-access-mlzkl\") pod \"node-resolver-zmmr7\" (UID: \"5f297cf4-7106-4aee-af55-e0a404e56b39\") " pod="openshift-dns/node-resolver-zmmr7" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.358649 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.358665 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.358685 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.358707 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5f297cf4-7106-4aee-af55-e0a404e56b39-hosts-file\") pod \"node-resolver-zmmr7\" (UID: \"5f297cf4-7106-4aee-af55-e0a404e56b39\") " pod="openshift-dns/node-resolver-zmmr7" Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358761 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358801 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:28:51.358734404 +0000 UTC m=+22.458299371 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358822 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358832 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358844 4669 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358848 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358861 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358873 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358875 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:51.358848387 +0000 UTC m=+22.458413554 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358887 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358902 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:51.358887788 +0000 UTC m=+22.458452755 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358919 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:51.358913238 +0000 UTC m=+22.458478215 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.358933 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:51.358926889 +0000 UTC m=+22.458491866 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.384225 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.402644 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.459574 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5f297cf4-7106-4aee-af55-e0a404e56b39-hosts-file\") pod \"node-resolver-zmmr7\" (UID: \"5f297cf4-7106-4aee-af55-e0a404e56b39\") " pod="openshift-dns/node-resolver-zmmr7" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 
11:28:50.459628 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzkl\" (UniqueName: \"kubernetes.io/projected/5f297cf4-7106-4aee-af55-e0a404e56b39-kube-api-access-mlzkl\") pod \"node-resolver-zmmr7\" (UID: \"5f297cf4-7106-4aee-af55-e0a404e56b39\") " pod="openshift-dns/node-resolver-zmmr7" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.459748 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5f297cf4-7106-4aee-af55-e0a404e56b39-hosts-file\") pod \"node-resolver-zmmr7\" (UID: \"5f297cf4-7106-4aee-af55-e0a404e56b39\") " pod="openshift-dns/node-resolver-zmmr7" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.474918 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzkl\" (UniqueName: \"kubernetes.io/projected/5f297cf4-7106-4aee-af55-e0a404e56b39-kube-api-access-mlzkl\") pod \"node-resolver-zmmr7\" (UID: \"5f297cf4-7106-4aee-af55-e0a404e56b39\") " pod="openshift-dns/node-resolver-zmmr7" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.517170 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zmmr7" Oct 01 11:28:50 crc kubenswrapper[4669]: W1001 11:28:50.529095 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f297cf4_7106_4aee_af55_e0a404e56b39.slice/crio-b44dc093ae4a78c38f2e00e31fe450cfc3c07a391cb6b29828d7b808d899020e WatchSource:0}: Error finding container b44dc093ae4a78c38f2e00e31fe450cfc3c07a391cb6b29828d7b808d899020e: Status 404 returned error can't find the container with id b44dc093ae4a78c38f2e00e31fe450cfc3c07a391cb6b29828d7b808d899020e Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.643264 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.643410 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.852029 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7376449bdde8d3ef39fd6093cbc4b7fb79b83c02df62c69297f29e7e91cd0891"} Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.859291 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0"} Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.859347 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281"} Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.859370 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"50246c3c8a4503a98aee8db8ede6347a59af638e2b122267b9888471a3c3cc9a"} Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.861199 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.864264 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zmmr7" event={"ID":"5f297cf4-7106-4aee-af55-e0a404e56b39","Type":"ContainerStarted","Data":"d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219"} Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.864339 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zmmr7" event={"ID":"5f297cf4-7106-4aee-af55-e0a404e56b39","Type":"ContainerStarted","Data":"b44dc093ae4a78c38f2e00e31fe450cfc3c07a391cb6b29828d7b808d899020e"} Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.865795 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469"} Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.865838 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e783b42bfe0e34930a681519c09424cf25e3b7788c6a9987913725db5318768b"} Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.875993 4669 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.876355 4669 scope.go:117] "RemoveContainer" containerID="79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4" Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.876596 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 11:28:50 crc kubenswrapper[4669]: E1001 11:28:50.877240 4669 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.878717 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:50Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.901000 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbb6d7d10ded0414bfb4997669e5c54851b0f8bd371986dd26b8cddb4389ebbd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"message\\\":\\\"W1001 11:28:32.976655 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 11:28:32.977004 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759318112 cert, and key in /tmp/serving-cert-2803551764/serving-signer.crt, 
/tmp/serving-cert-2803551764/serving-signer.key\\\\nI1001 11:28:33.210414 1 observer_polling.go:159] Starting file observer\\\\nW1001 11:28:33.216725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 11:28:33.217112 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:33.222626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2803551764/tls.crt::/tmp/serving-cert-2803551764/tls.key\\\\\\\"\\\\nF1001 11:28:33.405042 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:50Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.932125 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:50Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.960063 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:50Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.991547 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:50Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.998494 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-5rfqz"] Oct 01 11:28:50 crc kubenswrapper[4669]: I1001 11:28:50.998966 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.000131 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-fsthv"] Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.000752 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.001595 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.001730 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.001790 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.002736 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.003510 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.003799 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.004661 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.004791 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.004938 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.005401 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.023227 
4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.042668 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.057921 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.081151 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.099934 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.121015 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.139575 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.153245 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.163605 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-rootfs\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.163645 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6069cadd-c466-42b0-a195-f2b2537f17b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.163669 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-os-release\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.163741 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k997j\" (UniqueName: 
\"kubernetes.io/projected/6069cadd-c466-42b0-a195-f2b2537f17b6-kube-api-access-k997j\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.163769 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns6wn\" (UniqueName: \"kubernetes.io/projected/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-kube-api-access-ns6wn\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.163806 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-system-cni-dir\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.163850 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6069cadd-c466-42b0-a195-f2b2537f17b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.163869 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc 
kubenswrapper[4669]: I1001 11:28:51.163886 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-mcd-auth-proxy-config\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.163947 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-proxy-tls\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.164045 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-cnibin\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.164985 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.177361 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.190939 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.203578 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.215641 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.237491 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.252355 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.264591 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns6wn\" (UniqueName: \"kubernetes.io/projected/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-kube-api-access-ns6wn\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.264636 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-system-cni-dir\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.264663 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6069cadd-c466-42b0-a195-f2b2537f17b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.264682 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.264712 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-proxy-tls\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.264737 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-mcd-auth-proxy-config\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.264772 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-cnibin\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc 
kubenswrapper[4669]: I1001 11:28:51.264799 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-rootfs\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.264820 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6069cadd-c466-42b0-a195-f2b2537f17b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.264840 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k997j\" (UniqueName: \"kubernetes.io/projected/6069cadd-c466-42b0-a195-f2b2537f17b6-kube-api-access-k997j\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.264834 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-system-cni-dir\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.264857 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-os-release\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " 
pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.265024 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-os-release\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.265068 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-cnibin\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.265716 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-mcd-auth-proxy-config\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.265781 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6069cadd-c466-42b0-a195-f2b2537f17b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.265848 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-rootfs\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " 
pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.266007 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6069cadd-c466-42b0-a195-f2b2537f17b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.266293 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6069cadd-c466-42b0-a195-f2b2537f17b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-fsthv\" (UID: \"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.277625 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-proxy-tls\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.290120 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns6wn\" (UniqueName: \"kubernetes.io/projected/a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2-kube-api-access-ns6wn\") pod \"machine-config-daemon-5rfqz\" (UID: \"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\") " pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.293244 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k997j\" (UniqueName: \"kubernetes.io/projected/6069cadd-c466-42b0-a195-f2b2537f17b6-kube-api-access-k997j\") pod \"multus-additional-cni-plugins-fsthv\" (UID: 
\"6069cadd-c466-42b0-a195-f2b2537f17b6\") " pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.311638 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.317023 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fsthv" Oct 01 11:28:51 crc kubenswrapper[4669]: W1001 11:28:51.329175 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2a0a9d6_edb9_49ce_aa22_cdac1d6a49b2.slice/crio-daaa6a650f263e0b8fa1e24e0952648014bbc2df920971ea0584fda44e7b4453 WatchSource:0}: Error finding container daaa6a650f263e0b8fa1e24e0952648014bbc2df920971ea0584fda44e7b4453: Status 404 returned error can't find the container with id daaa6a650f263e0b8fa1e24e0952648014bbc2df920971ea0584fda44e7b4453 Oct 01 11:28:51 crc kubenswrapper[4669]: W1001 11:28:51.330557 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6069cadd_c466_42b0_a195_f2b2537f17b6.slice/crio-1bed6bbc9e89cb9e07e05b53b390c0effb239e339e599b682f2125407687d5d6 WatchSource:0}: Error finding container 1bed6bbc9e89cb9e07e05b53b390c0effb239e339e599b682f2125407687d5d6: Status 404 returned error can't find the container with id 1bed6bbc9e89cb9e07e05b53b390c0effb239e339e599b682f2125407687d5d6 Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.366643 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:51 
crc kubenswrapper[4669]: I1001 11:28:51.366756 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.366793 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.366816 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.366835 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.366965 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:28:51 crc 
kubenswrapper[4669]: E1001 11:28:51.366982 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.366994 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.367045 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:53.367032243 +0000 UTC m=+24.466597220 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.367376 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.367475 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.367473 4669 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.367507 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.367523 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.367494 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:28:53.367449293 +0000 UTC m=+24.467014270 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.367610 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:53.367596976 +0000 UTC m=+24.467161953 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.367625 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:53.367619357 +0000 UTC m=+24.467184324 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.367638 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:53.367631217 +0000 UTC m=+24.467196194 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.382938 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8kl5"] Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.383834 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-9kgdm"] Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.384093 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.384099 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.389329 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.389558 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.389674 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.389742 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.389871 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.390019 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.390089 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.390323 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.392630 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.405990 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.422143 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.445654 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.459571 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467284 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-multus-cni-dir\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467326 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-systemd\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467351 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-etc-openvswitch\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467377 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-var-lib-openvswitch\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467406 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-multus-socket-dir-parent\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467429 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/238b8e33-ca8b-419a-b038-329ab97a3843-multus-daemon-config\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467468 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-systemd-units\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467492 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-slash\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467517 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-openvswitch\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467542 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6c5784d2-a874-4956-9d09-e923ac324925-ovn-node-metrics-cert\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467565 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-var-lib-cni-bin\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467601 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-system-cni-dir\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467626 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-cnibin\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467647 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-run-netns\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467671 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-var-lib-cni-multus\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467691 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-etc-kubernetes\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467718 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45sfm\" (UniqueName: 
\"kubernetes.io/projected/6c5784d2-a874-4956-9d09-e923ac324925-kube-api-access-45sfm\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467746 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-netns\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467793 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-bin\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467828 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-run-multus-certs\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467853 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-netd\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467876 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-config\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467910 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-var-lib-kubelet\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467934 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-ovn\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.467971 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468019 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-os-release\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468126 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/238b8e33-ca8b-419a-b038-329ab97a3843-cni-binary-copy\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468188 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-hostroot\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468220 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhzk8\" (UniqueName: \"kubernetes.io/projected/238b8e33-ca8b-419a-b038-329ab97a3843-kube-api-access-fhzk8\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468244 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-log-socket\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468291 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-multus-conf-dir\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468313 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-script-lib\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468342 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468370 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-env-overrides\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468406 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-run-k8s-cni-cncf-io\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468429 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-kubelet\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.468470 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-node-log\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.476368 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.489612 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.505843 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.542611 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.560996 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.569693 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-run-multus-certs\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.569749 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-netd\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.569771 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-config\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570456 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-config\") pod \"ovnkube-node-z8kl5\" 
(UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570522 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-run-multus-certs\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570554 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-netd\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570590 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-ovn\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570610 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570629 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-var-lib-kubelet\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " 
pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570646 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-os-release\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570663 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/238b8e33-ca8b-419a-b038-329ab97a3843-cni-binary-copy\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570689 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-hostroot\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570708 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhzk8\" (UniqueName: \"kubernetes.io/projected/238b8e33-ca8b-419a-b038-329ab97a3843-kube-api-access-fhzk8\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570758 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-log-socket\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570783 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-ovn\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570808 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570830 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-var-lib-kubelet\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.570997 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-os-release\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571097 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-hostroot\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571246 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-log-socket\") pod \"ovnkube-node-z8kl5\" (UID: 
\"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571280 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-script-lib\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571302 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-multus-conf-dir\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571330 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571349 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-env-overrides\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571370 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-run-k8s-cni-cncf-io\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " 
pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571410 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-run-k8s-cni-cncf-io\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571577 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-multus-conf-dir\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571629 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571984 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-env-overrides\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.571995 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/238b8e33-ca8b-419a-b038-329ab97a3843-cni-binary-copy\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572084 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-kubelet\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572134 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-node-log\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572421 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-multus-cni-dir\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572468 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-multus-cni-dir\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572504 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-systemd\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572499 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-kubelet\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572568 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-etc-openvswitch\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572534 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-etc-openvswitch\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572620 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-var-lib-openvswitch\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572591 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-script-lib\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572620 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-systemd\") pod 
\"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572695 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-node-log\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572651 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/238b8e33-ca8b-419a-b038-329ab97a3843-multus-daemon-config\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572722 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-var-lib-openvswitch\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572746 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-multus-socket-dir-parent\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572771 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-openvswitch\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572791 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-systemd-units\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572810 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-slash\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572831 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6c5784d2-a874-4956-9d09-e923ac324925-ovn-node-metrics-cert\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572852 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-var-lib-cni-bin\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572893 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-var-lib-cni-bin\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 
11:28:51.572903 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-systemd-units\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572925 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-slash\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.572979 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-multus-socket-dir-parent\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.573061 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-cnibin\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.573106 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-run-netns\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.573180 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/238b8e33-ca8b-419a-b038-329ab97a3843-multus-daemon-config\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.573018 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-openvswitch\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.573199 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-cnibin\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.573232 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-var-lib-cni-multus\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.573243 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-var-lib-cni-multus\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.574825 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-etc-kubernetes\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " 
pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.574865 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-system-cni-dir\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.574886 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45sfm\" (UniqueName: \"kubernetes.io/projected/6c5784d2-a874-4956-9d09-e923ac324925-kube-api-access-45sfm\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.574905 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-bin\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.574927 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-netns\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.574989 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-netns\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 
11:28:51.573658 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-host-run-netns\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.575020 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-etc-kubernetes\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.575056 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/238b8e33-ca8b-419a-b038-329ab97a3843-system-cni-dir\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.575386 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-bin\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.578430 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6c5784d2-a874-4956-9d09-e923ac324925-ovn-node-metrics-cert\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.587379 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.604024 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhzk8\" (UniqueName: \"kubernetes.io/projected/238b8e33-ca8b-419a-b038-329ab97a3843-kube-api-access-fhzk8\") pod \"multus-9kgdm\" (UID: \"238b8e33-ca8b-419a-b038-329ab97a3843\") " pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.606672 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45sfm\" (UniqueName: \"kubernetes.io/projected/6c5784d2-a874-4956-9d09-e923ac324925-kube-api-access-45sfm\") pod \"ovnkube-node-z8kl5\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.608352 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.621105 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.634159 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.643132 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.643167 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.643278 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.643481 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.647341 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.648167 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.649428 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.649530 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.650190 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.651228 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.651726 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.652338 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.653345 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.653984 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.654941 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.655473 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.656598 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.657148 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.657667 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.658577 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.659153 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.660231 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.660652 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.661244 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" 
path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.662335 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.662848 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.663407 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.663810 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.664276 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.665281 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.665737 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.666363 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.667416 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.667890 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.668955 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.669453 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.670325 4669 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.670430 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.672071 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.672989 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.673549 4669 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.675190 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.675849 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.676832 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.677480 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.678592 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.679171 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.679401 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.680456 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.681128 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.682160 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.682709 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.683848 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.684379 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.685495 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.685992 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.686856 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.687467 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.688376 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.688946 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.689426 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.693750 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.702551 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9kgdm" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.709922 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.717592 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319
fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: W1001 11:28:51.726949 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod238b8e33_ca8b_419a_b038_329ab97a3843.slice/crio-8430fb44e86f83a7b0750f5baafddfc61e7df074f6d1a129b36316020029d872 WatchSource:0}: Error finding container 8430fb44e86f83a7b0750f5baafddfc61e7df074f6d1a129b36316020029d872: Status 404 returned error can't find the container with id 8430fb44e86f83a7b0750f5baafddfc61e7df074f6d1a129b36316020029d872 Oct 01 11:28:51 crc kubenswrapper[4669]: W1001 11:28:51.728648 4669 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c5784d2_a874_4956_9d09_e923ac324925.slice/crio-cbdf041ce88efe1f02c8d5fd5d3255c519a7efa5d979db042de2c0fcc6791d0e WatchSource:0}: Error finding container cbdf041ce88efe1f02c8d5fd5d3255c519a7efa5d979db042de2c0fcc6791d0e: Status 404 returned error can't find the container with id cbdf041ce88efe1f02c8d5fd5d3255c519a7efa5d979db042de2c0fcc6791d0e Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.747096 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.771863 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.784658 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.799904 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.814425 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.837127 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.867722 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.870349 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9kgdm" 
event={"ID":"238b8e33-ca8b-419a-b038-329ab97a3843","Type":"ContainerStarted","Data":"7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471"} Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.870416 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9kgdm" event={"ID":"238b8e33-ca8b-419a-b038-329ab97a3843","Type":"ContainerStarted","Data":"8430fb44e86f83a7b0750f5baafddfc61e7df074f6d1a129b36316020029d872"} Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.870542 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.871809 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" event={"ID":"6069cadd-c466-42b0-a195-f2b2537f17b6","Type":"ContainerStarted","Data":"35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001"} Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.871866 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" event={"ID":"6069cadd-c466-42b0-a195-f2b2537f17b6","Type":"ContainerStarted","Data":"1bed6bbc9e89cb9e07e05b53b390c0effb239e339e599b682f2125407687d5d6"} Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.872993 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0" exitCode=0 Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.873070 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"} Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.873144 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"cbdf041ce88efe1f02c8d5fd5d3255c519a7efa5d979db042de2c0fcc6791d0e"} Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.874700 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9"} Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.874763 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054"} Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.874776 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"daaa6a650f263e0b8fa1e24e0952648014bbc2df920971ea0584fda44e7b4453"} Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.875380 4669 scope.go:117] "RemoveContainer" containerID="79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4" Oct 01 11:28:51 crc kubenswrapper[4669]: E1001 11:28:51.875541 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.886423 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.886725 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.891063 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.901051 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.914144 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.930728 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.948723 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.961904 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.973331 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:51 crc kubenswrapper[4669]: I1001 11:28:51.991966 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:51Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.009216 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.042379 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.080998 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.122849 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.163699 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.201606 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.238892 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67
008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.277489 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.336755 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.364278 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.403168 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.440986 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ent
rypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.476437 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.526096 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.559823 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.597594 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.643631 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:28:52 crc kubenswrapper[4669]: E1001 11:28:52.643792 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.644325 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.678912 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.730411 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.757985 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.881982 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" 
event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"} Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.882463 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"} Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.882474 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"} Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.882484 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"} Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.882493 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"} Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.882502 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"} Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.883265 4669 generic.go:334] "Generic (PLEG): container finished" podID="6069cadd-c466-42b0-a195-f2b2537f17b6" containerID="35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001" 
exitCode=0 Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.883362 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" event={"ID":"6069cadd-c466-42b0-a195-f2b2537f17b6","Type":"ContainerDied","Data":"35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001"} Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.884586 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f"} Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.910539 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: 
I1001 11:28:52.933152 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.950586 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bin
ary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.971345 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:52 crc kubenswrapper[4669]: I1001 11:28:52.997390 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:52Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.013382 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.036275 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.078381 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.118639 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.160986 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.197700 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.239029 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.277646 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.319778 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.359605 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592
717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.396579 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.396890 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.396976 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:28:57.396954738 +0000 UTC m=+28.496519715 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.397621 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.397800 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.397852 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.398015 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.398154 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:57.398118056 +0000 UTC m=+28.497683183 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.398014 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.398189 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.398207 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.398267 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.398313 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.398326 4669 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.398333 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.398368 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:57.398340892 +0000 UTC m=+28.497905879 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.398395 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:57.398384223 +0000 UTC m=+28.497949210 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.398647 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.398691 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:28:57.39868106 +0000 UTC m=+28.498246037 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.438828 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.484691 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.524401 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.558547 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.603374 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.643481 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.643516 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.643708 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:28:53 crc kubenswrapper[4669]: E1001 11:28:53.643884 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.644510 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.686608 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.719959 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.770606 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511
b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.801928 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.836581 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.878865 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.890488 4669 generic.go:334] "Generic (PLEG): container finished" podID="6069cadd-c466-42b0-a195-f2b2537f17b6" containerID="73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be" exitCode=0 Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.890600 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" event={"ID":"6069cadd-c466-42b0-a195-f2b2537f17b6","Type":"ContainerDied","Data":"73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be"} Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.918612 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 
2025-08-24T17:21:41Z" Oct 01 11:28:53 crc kubenswrapper[4669]: I1001 11:28:53.973337 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:53Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.014121 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.038500 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.059947 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bf8lj"] Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.060478 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bf8lj" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.076573 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d
4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.088198 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.108935 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.127702 4669 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"kube-root-ca.crt" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.149880 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.199717 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.205123 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f26e14fc-b10f-49ae-9639-6974d58e88ec-serviceca\") pod \"node-ca-bf8lj\" (UID: \"f26e14fc-b10f-49ae-9639-6974d58e88ec\") " pod="openshift-image-registry/node-ca-bf8lj" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.205198 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f26e14fc-b10f-49ae-9639-6974d58e88ec-host\") pod \"node-ca-bf8lj\" (UID: \"f26e14fc-b10f-49ae-9639-6974d58e88ec\") " pod="openshift-image-registry/node-ca-bf8lj" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.205228 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfkls\" (UniqueName: \"kubernetes.io/projected/f26e14fc-b10f-49ae-9639-6974d58e88ec-kube-api-access-gfkls\") pod \"node-ca-bf8lj\" (UID: \"f26e14fc-b10f-49ae-9639-6974d58e88ec\") " pod="openshift-image-registry/node-ca-bf8lj" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.245214 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.279505 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.306686 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/f26e14fc-b10f-49ae-9639-6974d58e88ec-serviceca\") pod \"node-ca-bf8lj\" (UID: \"f26e14fc-b10f-49ae-9639-6974d58e88ec\") " pod="openshift-image-registry/node-ca-bf8lj" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.306764 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f26e14fc-b10f-49ae-9639-6974d58e88ec-host\") pod \"node-ca-bf8lj\" (UID: \"f26e14fc-b10f-49ae-9639-6974d58e88ec\") " pod="openshift-image-registry/node-ca-bf8lj" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.306805 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfkls\" (UniqueName: \"kubernetes.io/projected/f26e14fc-b10f-49ae-9639-6974d58e88ec-kube-api-access-gfkls\") pod \"node-ca-bf8lj\" (UID: \"f26e14fc-b10f-49ae-9639-6974d58e88ec\") " pod="openshift-image-registry/node-ca-bf8lj" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.307067 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f26e14fc-b10f-49ae-9639-6974d58e88ec-host\") pod \"node-ca-bf8lj\" (UID: \"f26e14fc-b10f-49ae-9639-6974d58e88ec\") " pod="openshift-image-registry/node-ca-bf8lj" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.309001 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f26e14fc-b10f-49ae-9639-6974d58e88ec-serviceca\") pod \"node-ca-bf8lj\" (UID: \"f26e14fc-b10f-49ae-9639-6974d58e88ec\") " pod="openshift-image-registry/node-ca-bf8lj" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.326968 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.349360 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfkls\" (UniqueName: \"kubernetes.io/projected/f26e14fc-b10f-49ae-9639-6974d58e88ec-kube-api-access-gfkls\") pod \"node-ca-bf8lj\" (UID: \"f26e14fc-b10f-49ae-9639-6974d58e88ec\") " pod="openshift-image-registry/node-ca-bf8lj" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.379757 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bf8lj" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.379764 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: W1001 11:28:54.401156 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf26e14fc_b10f_49ae_9639_6974d58e88ec.slice/crio-551c80df986ce36d99a133bd30f1c3e79e9f899e9a070563cc1462be3de86bc8 WatchSource:0}: Error finding container 551c80df986ce36d99a133bd30f1c3e79e9f899e9a070563cc1462be3de86bc8: Status 404 returned error can't find the container with id 551c80df986ce36d99a133bd30f1c3e79e9f899e9a070563cc1462be3de86bc8 Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.427173 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.461423 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592
717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.499231 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.541173 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.583951 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.617570 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.643107 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:28:54 crc kubenswrapper[4669]: E1001 11:28:54.643271 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.662532 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.702121 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592
717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.742619 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.777590 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.819921 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.857798 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.898728 4669 generic.go:334] "Generic (PLEG): container finished" podID="6069cadd-c466-42b0-a195-f2b2537f17b6" containerID="88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb" exitCode=0 Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.898831 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" 
event={"ID":"6069cadd-c466-42b0-a195-f2b2537f17b6","Type":"ContainerDied","Data":"88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb"} Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.901725 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bf8lj" event={"ID":"f26e14fc-b10f-49ae-9639-6974d58e88ec","Type":"ContainerStarted","Data":"6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da"} Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.901783 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bf8lj" event={"ID":"f26e14fc-b10f-49ae-9639-6974d58e88ec","Type":"ContainerStarted","Data":"551c80df986ce36d99a133bd30f1c3e79e9f899e9a070563cc1462be3de86bc8"} Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.909224 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"} Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.914768 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.940014 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:54 crc kubenswrapper[4669]: I1001 11:28:54.978036 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:54Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.020427 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.059938 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.101786 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.138923 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.177688 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.227362 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 
11:28:55.256960 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.305937 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.343968 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.379210 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.427099 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.456498 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.499181 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.526581 4669 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.532741 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.532789 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.532802 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.533038 4669 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.537989 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.590382 4669 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.590825 4669 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.592642 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.592704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.592724 4669 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.592752 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.592770 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:55Z","lastTransitionTime":"2025-10-01T11:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:55 crc kubenswrapper[4669]: E1001 11:28:55.607475 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.611210 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.611270 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.611287 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.611310 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.611328 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:55Z","lastTransitionTime":"2025-10-01T11:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.621877 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: E1001 11:28:55.626920 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.631683 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.631729 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.631742 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.631762 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.631778 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:55Z","lastTransitionTime":"2025-10-01T11:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.643371 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:28:55 crc kubenswrapper[4669]: E1001 11:28:55.643512 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.644010 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:55 crc kubenswrapper[4669]: E1001 11:28:55.644124 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:28:55 crc kubenswrapper[4669]: E1001 11:28:55.647851 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.653199 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.653262 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.653276 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.653290 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.653303 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:55Z","lastTransitionTime":"2025-10-01T11:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.658978 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: E1001 11:28:55.668460 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.673007 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.673063 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.673101 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.673125 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.673143 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:55Z","lastTransitionTime":"2025-10-01T11:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:55 crc kubenswrapper[4669]: E1001 11:28:55.685646 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: E1001 11:28:55.685767 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.687771 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.687831 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.687842 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.687868 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.687882 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:55Z","lastTransitionTime":"2025-10-01T11:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.696869 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.739937 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.778536 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.791098 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.791141 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.791150 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.791164 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.791174 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:55Z","lastTransitionTime":"2025-10-01T11:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.894352 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.894384 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.894394 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.894410 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.894422 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:55Z","lastTransitionTime":"2025-10-01T11:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.918134 4669 generic.go:334] "Generic (PLEG): container finished" podID="6069cadd-c466-42b0-a195-f2b2537f17b6" containerID="618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764" exitCode=0 Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.918174 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" event={"ID":"6069cadd-c466-42b0-a195-f2b2537f17b6","Type":"ContainerDied","Data":"618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764"} Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.943125 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.962889 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:55 crc kubenswrapper[4669]: I1001 11:28:55.990014 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:55Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.000380 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.000432 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.000444 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.000469 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.000533 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:56Z","lastTransitionTime":"2025-10-01T11:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.006334 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.024851 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.037681 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.055807 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.097137 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.104379 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.104424 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.104433 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.104450 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.104588 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:56Z","lastTransitionTime":"2025-10-01T11:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.138188 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z 
is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.182375 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.207732 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.207784 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.207796 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.207815 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.207838 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:56Z","lastTransitionTime":"2025-10-01T11:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.222689 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.260617 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.300621 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.311290 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.311332 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.311342 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.311358 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.311369 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:56Z","lastTransitionTime":"2025-10-01T11:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.347353 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.400177 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.414168 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.414227 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.414242 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.414262 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.414278 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:56Z","lastTransitionTime":"2025-10-01T11:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.518334 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.518381 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.518392 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.518409 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.518421 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:56Z","lastTransitionTime":"2025-10-01T11:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.621318 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.621387 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.621409 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.621446 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.621471 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:56Z","lastTransitionTime":"2025-10-01T11:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.643217 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:28:56 crc kubenswrapper[4669]: E1001 11:28:56.643398 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.725300 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.725360 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.725377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.725403 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.725422 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:56Z","lastTransitionTime":"2025-10-01T11:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.828781 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.828831 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.828843 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.828863 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.828877 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:56Z","lastTransitionTime":"2025-10-01T11:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.932303 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" event={"ID":"6069cadd-c466-42b0-a195-f2b2537f17b6","Type":"ContainerStarted","Data":"3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638"} Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.934250 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.934321 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.934346 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.934375 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.934400 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:56Z","lastTransitionTime":"2025-10-01T11:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.957192 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:56 crc kubenswrapper[4669]: I1001 11:28:56.983430 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k99
7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:56Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.010523 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.029540 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.040063 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.040176 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.040198 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.040223 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.040242 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:57Z","lastTransitionTime":"2025-10-01T11:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.053581 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.079152 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.109097 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.131744 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.142691 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.142731 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.142740 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.142756 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.142768 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:57Z","lastTransitionTime":"2025-10-01T11:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.164795 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.181214 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.201311 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.221586 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592
717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.239357 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.246402 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.246464 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.246482 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.246511 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.246530 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:57Z","lastTransitionTime":"2025-10-01T11:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.256125 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.270941 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.349849 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.350178 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.350271 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.350339 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.350399 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:57Z","lastTransitionTime":"2025-10-01T11:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.439642 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.439938 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:29:05.439888463 +0000 UTC m=+36.539453470 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.440198 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.440408 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 
11:28:57.440407 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.440495 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:29:05.440473607 +0000 UTC m=+36.540038614 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.440692 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.440836 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:28:57 crc 
kubenswrapper[4669]: E1001 11:28:57.440905 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.441005 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:29:05.44097721 +0000 UTC m=+36.540542217 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.441163 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.441221 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.441243 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.441341 4669 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 11:29:05.441316878 +0000 UTC m=+36.540881865 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.441689 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.441834 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.442444 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.442572 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 11:29:05.442546429 +0000 UTC m=+36.542111606 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.454493 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.454575 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.454594 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.454618 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.454636 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:57Z","lastTransitionTime":"2025-10-01T11:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.558724 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.558782 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.558800 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.558824 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.558846 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:57Z","lastTransitionTime":"2025-10-01T11:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.643855 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.643933 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.644151 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:28:57 crc kubenswrapper[4669]: E1001 11:28:57.644371 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.661280 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.661326 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.661344 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.661367 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.661388 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:57Z","lastTransitionTime":"2025-10-01T11:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.764285 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.764339 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.764361 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.764388 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.764407 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:57Z","lastTransitionTime":"2025-10-01T11:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.868000 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.868114 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.868147 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.868196 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.868222 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:57Z","lastTransitionTime":"2025-10-01T11:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.942884 4669 generic.go:334] "Generic (PLEG): container finished" podID="6069cadd-c466-42b0-a195-f2b2537f17b6" containerID="3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638" exitCode=0 Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.942999 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" event={"ID":"6069cadd-c466-42b0-a195-f2b2537f17b6","Type":"ContainerDied","Data":"3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.951841 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.952425 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.971637 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.971766 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.971792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.971826 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.971850 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:57Z","lastTransitionTime":"2025-10-01T11:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:57 crc kubenswrapper[4669]: I1001 11:28:57.985646 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.008014 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.024229 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.041945 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.042686 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.061318 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T1
1:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.074445 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.074482 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.074497 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.074517 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.074532 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:58Z","lastTransitionTime":"2025-10-01T11:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.086963 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.103330 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.123968 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.142388 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.159380 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.177136 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.177180 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.177196 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.177212 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.177225 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:58Z","lastTransitionTime":"2025-10-01T11:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.179258 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.198594 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.203470 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.204318 4669 scope.go:117] "RemoveContainer" containerID="79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4" Oct 01 11:28:58 crc kubenswrapper[4669]: E1001 11:28:58.204507 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.215346 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.230656 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.248199 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.270860 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.280891 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.280961 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.280973 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.281000 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.281015 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:58Z","lastTransitionTime":"2025-10-01T11:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.288533 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.301748 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.315785 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.331751 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.344849 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.354898 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T1
1:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.376753 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcad
a0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.384348 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.384395 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.384407 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.384427 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.384441 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:58Z","lastTransitionTime":"2025-10-01T11:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.402564 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.418944 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.444266 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.464006 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.478832 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.487564 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.487591 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.487600 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.487616 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.487629 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:58Z","lastTransitionTime":"2025-10-01T11:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.493108 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.512262 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.591738 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:58 crc 
kubenswrapper[4669]: I1001 11:28:58.591827 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.591850 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.591885 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.591905 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:58Z","lastTransitionTime":"2025-10-01T11:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.643424 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:28:58 crc kubenswrapper[4669]: E1001 11:28:58.643667 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.695834 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.695897 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.695914 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.695943 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.695962 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:58Z","lastTransitionTime":"2025-10-01T11:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.800311 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.800394 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.800421 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.800456 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.800481 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:58Z","lastTransitionTime":"2025-10-01T11:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.910145 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.910227 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.910248 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.910281 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.910302 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:58Z","lastTransitionTime":"2025-10-01T11:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.962483 4669 generic.go:334] "Generic (PLEG): container finished" podID="6069cadd-c466-42b0-a195-f2b2537f17b6" containerID="99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0" exitCode=0 Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.962612 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" event={"ID":"6069cadd-c466-42b0-a195-f2b2537f17b6","Type":"ContainerDied","Data":"99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0"} Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.962767 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.963486 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:58 crc kubenswrapper[4669]: I1001 11:28:58.986038 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:58Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.008783 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.016186 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.016220 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.016235 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.016257 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.016273 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:59Z","lastTransitionTime":"2025-10-01T11:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.017698 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.027911 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.047943 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.062361 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.081337 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01
T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.104253 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.119738 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.120400 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.120434 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.120448 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.120471 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.120486 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:59Z","lastTransitionTime":"2025-10-01T11:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.140516 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.153821 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.167328 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.181981 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.199759 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.223399 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.223453 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.223469 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.223493 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.223510 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:59Z","lastTransitionTime":"2025-10-01T11:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.229258 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.253584 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.275719 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.293785 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.316506 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T1
1:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.326462 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.326530 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.326556 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.326592 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.326619 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:59Z","lastTransitionTime":"2025-10-01T11:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.343249 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.362011 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d
4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.384126 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.419594 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.429794 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.429855 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.429870 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.429895 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.429910 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:59Z","lastTransitionTime":"2025-10-01T11:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.443628 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700
cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.477978 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.499853 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.517764 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.534524 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.534688 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.534731 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.534761 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.534787 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:59Z","lastTransitionTime":"2025-10-01T11:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.541632 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.566747 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.588634 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.610697 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.638578 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.638661 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.638686 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.638723 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.638749 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:59Z","lastTransitionTime":"2025-10-01T11:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.643512 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.643606 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:28:59 crc kubenswrapper[4669]: E1001 11:28:59.643726 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:28:59 crc kubenswrapper[4669]: E1001 11:28:59.643813 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.668955 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.694794 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.727056 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.742240 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.742294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.742312 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.742342 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.742363 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:59Z","lastTransitionTime":"2025-10-01T11:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.751750 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.768012 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.786513 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.811429 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.846252 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.846310 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.846331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.846358 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.846380 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:59Z","lastTransitionTime":"2025-10-01T11:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.847276 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.869908 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.893423 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.915772 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.933848 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.949638 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.949689 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.949703 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.949729 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.949743 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:28:59Z","lastTransitionTime":"2025-10-01T11:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.954528 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.972130 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.972519 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" event={"ID":"6069cadd-c466-42b0-a195-f2b2537f17b6","Type":"ContainerStarted","Data":"316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30"} Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.972695 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 11:28:59 crc kubenswrapper[4669]: I1001 11:28:59.988733 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:28:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.029961 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.053209 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.053294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.053318 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.053352 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.053377 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:00Z","lastTransitionTime":"2025-10-01T11:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.060523 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.099428 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.143190 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.156292 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.156334 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.156347 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.156365 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.156380 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:00Z","lastTransitionTime":"2025-10-01T11:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.183072 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z 
is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.226547 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.260049 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.261526 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.261603 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.261624 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.261654 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.261674 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:00Z","lastTransitionTime":"2025-10-01T11:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.300005 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.343194 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.365467 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.365522 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.365539 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.365564 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.365584 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:00Z","lastTransitionTime":"2025-10-01T11:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.383518 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.419668 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592
717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.460935 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.468722 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.468775 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.468795 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.468819 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.468839 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:00Z","lastTransitionTime":"2025-10-01T11:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.500186 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.543960 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.573119 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.573194 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.573214 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.573240 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.573262 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:00Z","lastTransitionTime":"2025-10-01T11:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.584770 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.643846 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:00 crc kubenswrapper[4669]: E1001 11:29:00.644112 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.677551 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.677622 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.677645 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.677678 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.677699 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:00Z","lastTransitionTime":"2025-10-01T11:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.781449 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.781537 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.781561 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.781595 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.781617 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:00Z","lastTransitionTime":"2025-10-01T11:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.884436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.884515 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.884537 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.884568 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.884592 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:00Z","lastTransitionTime":"2025-10-01T11:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.977306 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.987287 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.987337 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.987354 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.987378 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:00 crc kubenswrapper[4669]: I1001 11:29:00.987399 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:00Z","lastTransitionTime":"2025-10-01T11:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.091693 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.091786 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.091806 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.092375 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.092457 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:01Z","lastTransitionTime":"2025-10-01T11:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.195330 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.195373 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.195384 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.195404 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.195419 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:01Z","lastTransitionTime":"2025-10-01T11:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.298622 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.298690 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.298704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.298727 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.298743 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:01Z","lastTransitionTime":"2025-10-01T11:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.402363 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.402474 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.402501 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.402572 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.402599 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:01Z","lastTransitionTime":"2025-10-01T11:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.506354 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.506400 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.506410 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.506429 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.506440 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:01Z","lastTransitionTime":"2025-10-01T11:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.609686 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.609761 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.609786 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.609816 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.609835 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:01Z","lastTransitionTime":"2025-10-01T11:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.643497 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.643573 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:01 crc kubenswrapper[4669]: E1001 11:29:01.643739 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:01 crc kubenswrapper[4669]: E1001 11:29:01.643880 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.717295 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.717345 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.717358 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.717381 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.717395 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:01Z","lastTransitionTime":"2025-10-01T11:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.820358 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.820414 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.820434 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.820458 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.820475 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:01Z","lastTransitionTime":"2025-10-01T11:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.924227 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.924300 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.924322 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.924349 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:01 crc kubenswrapper[4669]: I1001 11:29:01.924370 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:01Z","lastTransitionTime":"2025-10-01T11:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.028318 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.028392 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.028412 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.028440 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.028460 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:02Z","lastTransitionTime":"2025-10-01T11:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.132708 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.133202 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.133395 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.133537 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.133665 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:02Z","lastTransitionTime":"2025-10-01T11:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.236504 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.236543 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.236553 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.236568 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.236581 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:02Z","lastTransitionTime":"2025-10-01T11:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.340335 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.340411 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.340430 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.340458 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.340479 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:02Z","lastTransitionTime":"2025-10-01T11:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.443901 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.443961 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.443975 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.443997 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.444012 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:02Z","lastTransitionTime":"2025-10-01T11:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.547608 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.547672 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.547689 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.547716 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.547737 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:02Z","lastTransitionTime":"2025-10-01T11:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.643688 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:02 crc kubenswrapper[4669]: E1001 11:29:02.643924 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.650719 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.650764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.650774 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.650793 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.650804 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:02Z","lastTransitionTime":"2025-10-01T11:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.753460 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.753552 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.753571 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.753599 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.753620 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:02Z","lastTransitionTime":"2025-10-01T11:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.856340 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.856398 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.856411 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.856434 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.856445 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:02Z","lastTransitionTime":"2025-10-01T11:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.960169 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.960272 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.960305 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.960343 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.960366 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:02Z","lastTransitionTime":"2025-10-01T11:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.987776 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/0.log" Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.992695 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a" exitCode=1 Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.992763 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a"} Oct 01 11:29:02 crc kubenswrapper[4669]: I1001 11:29:02.994312 4669 scope.go:117] "RemoveContainer" containerID="c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.019779 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592
717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.041093 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.062104 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.064532 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.064589 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.064611 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.064638 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.064658 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:03Z","lastTransitionTime":"2025-10-01T11:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.078705 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.101258 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.120698 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.142301 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T1
1:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.167629 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8
fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.167964 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.168020 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.168031 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.168049 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.168416 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:03Z","lastTransitionTime":"2025-10-01T11:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.188805 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z 
is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.224480 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:02Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:02.002136 5951 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 11:29:02.002188 5951 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI1001 11:29:02.002205 5951 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 11:29:02.002228 5951 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 11:29:02.002248 5951 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 11:29:02.002280 5951 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 11:29:02.002297 5951 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 11:29:02.002274 5951 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 11:29:02.002315 5951 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 11:29:02.002398 5951 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 11:29:02.002417 5951 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 11:29:02.002422 5951 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 11:29:02.002459 5951 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 11:29:02.002468 5951 factory.go:656] Stopping watch factory\\\\nI1001 11:29:02.002484 5951 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.244777 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.271545 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.273183 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.273233 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.273253 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.273274 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.273287 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:03Z","lastTransitionTime":"2025-10-01T11:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.284472 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.294008 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.304381 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:03Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.376805 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.376868 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.376887 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.376933 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.376993 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:03Z","lastTransitionTime":"2025-10-01T11:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.481191 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.481239 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.481257 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.481281 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.481303 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:03Z","lastTransitionTime":"2025-10-01T11:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.583976 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.584041 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.584063 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.584115 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.584134 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:03Z","lastTransitionTime":"2025-10-01T11:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.644137 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.644169 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:03 crc kubenswrapper[4669]: E1001 11:29:03.644306 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:03 crc kubenswrapper[4669]: E1001 11:29:03.644410 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.687575 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.687630 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.687696 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.687718 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.687730 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:03Z","lastTransitionTime":"2025-10-01T11:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.790821 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.790881 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.790894 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.790916 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.790929 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:03Z","lastTransitionTime":"2025-10-01T11:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.894136 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.894207 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.894223 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.894247 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.894264 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:03Z","lastTransitionTime":"2025-10-01T11:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.996574 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.996621 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.996634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.996658 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:03 crc kubenswrapper[4669]: I1001 11:29:03.996692 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:03Z","lastTransitionTime":"2025-10-01T11:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.000238 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/0.log" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.004663 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16"} Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.004916 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.033599 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b
154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is 
after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.050857 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5"] Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.051447 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: W1001 11:29:04.053193 4669 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd": failed to list *v1.Secret: secrets "ovn-kubernetes-control-plane-dockercfg-gs7dd" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 01 11:29:04 crc kubenswrapper[4669]: E1001 11:29:04.053263 4669 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-gs7dd\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-control-plane-dockercfg-gs7dd\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 11:29:04 crc kubenswrapper[4669]: W1001 11:29:04.053459 4669 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert": failed to list *v1.Secret: secrets "ovn-control-plane-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 01 11:29:04 crc kubenswrapper[4669]: E1001 11:29:04.053523 4669 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\": Failed to watch 
*v1.Secret: failed to list *v1.Secret: secrets \"ovn-control-plane-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.058543 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.078343 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.096711 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.098891 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.098934 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.098942 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.098959 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.098977 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:04Z","lastTransitionTime":"2025-10-01T11:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.111211 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.125184 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.130665 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6bac435-6175-448d-a057-faaa4fa8114b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.130772 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6bac435-6175-448d-a057-faaa4fa8114b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.131115 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6bac435-6175-448d-a057-faaa4fa8114b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.131159 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn2g7\" (UniqueName: \"kubernetes.io/projected/f6bac435-6175-448d-a057-faaa4fa8114b-kube-api-access-fn2g7\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.143028 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:02Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:02.002136 5951 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 11:29:02.002188 5951 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 11:29:02.002205 5951 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 
11:29:02.002228 5951 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 11:29:02.002248 5951 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 11:29:02.002280 5951 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 11:29:02.002297 5951 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 11:29:02.002274 5951 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 11:29:02.002315 5951 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 11:29:02.002398 5951 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 11:29:02.002417 5951 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 11:29:02.002422 5951 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 11:29:02.002459 5951 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 11:29:02.002468 5951 factory.go:656] Stopping watch factory\\\\nI1001 11:29:02.002484 5951 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.159562 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.185324 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.201844 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.201897 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.201909 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.201929 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.201942 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:04Z","lastTransitionTime":"2025-10-01T11:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.206238 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.223583 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.232709 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6bac435-6175-448d-a057-faaa4fa8114b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.232793 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn2g7\" (UniqueName: \"kubernetes.io/projected/f6bac435-6175-448d-a057-faaa4fa8114b-kube-api-access-fn2g7\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.232864 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6bac435-6175-448d-a057-faaa4fa8114b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.232924 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/f6bac435-6175-448d-a057-faaa4fa8114b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.234062 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6bac435-6175-448d-a057-faaa4fa8114b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.234395 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6bac435-6175-448d-a057-faaa4fa8114b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.241286 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592
717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.258278 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn2g7\" (UniqueName: \"kubernetes.io/projected/f6bac435-6175-448d-a057-faaa4fa8114b-kube-api-access-fn2g7\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.259804 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.278781 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.292742 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.304484 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.304523 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.304535 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.304553 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.304564 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:04Z","lastTransitionTime":"2025-10-01T11:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.307880 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.343915 4669 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b
5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2
a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.360862 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.375194 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.393152 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.407554 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.407607 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.407619 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.407638 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.407652 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:04Z","lastTransitionTime":"2025-10-01T11:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.412668 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z 
is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.437960 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:02Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:02.002136 5951 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 11:29:02.002188 5951 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 11:29:02.002205 5951 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 
11:29:02.002228 5951 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 11:29:02.002248 5951 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 11:29:02.002280 5951 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 11:29:02.002297 5951 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 11:29:02.002274 5951 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 11:29:02.002315 5951 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 11:29:02.002398 5951 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 11:29:02.002417 5951 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 11:29:02.002422 5951 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 11:29:02.002459 5951 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 11:29:02.002468 5951 factory.go:656] Stopping watch factory\\\\nI1001 11:29:02.002484 5951 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.459423 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592
717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.479818 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.494993 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.511547 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.511641 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.511656 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.511682 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.511701 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:04Z","lastTransitionTime":"2025-10-01T11:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.514390 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.532039 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.549367 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.567592 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.588050 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.611882 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"con
tainerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.614735 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.614774 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.614815 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.614837 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.614851 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:04Z","lastTransitionTime":"2025-10-01T11:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.644149 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:04 crc kubenswrapper[4669]: E1001 11:29:04.644391 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.718234 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.718330 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.718366 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.718407 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.718435 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:04Z","lastTransitionTime":"2025-10-01T11:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.792820 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wvnw6"] Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.794043 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:04 crc kubenswrapper[4669]: E1001 11:29:04.794198 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.819984 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.822539 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.822629 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.822659 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.822695 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.822723 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:04Z","lastTransitionTime":"2025-10-01T11:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.839778 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.839932 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgm6k\" (UniqueName: \"kubernetes.io/projected/30ba513f-67c5-4e4f-b8a7-be9c67660bec-kube-api-access-kgm6k\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.841048 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.868210 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.883994 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.899121 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.915051 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.926546 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.926606 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.926622 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.926645 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.926663 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:04Z","lastTransitionTime":"2025-10-01T11:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.934587 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z 
is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.941056 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:04 crc kubenswrapper[4669]: E1001 11:29:04.941282 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:04 crc kubenswrapper[4669]: E1001 11:29:04.941416 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs podName:30ba513f-67c5-4e4f-b8a7-be9c67660bec nodeName:}" failed. No retries permitted until 2025-10-01 11:29:05.441384725 +0000 UTC m=+36.540949912 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs") pod "network-metrics-daemon-wvnw6" (UID: "30ba513f-67c5-4e4f-b8a7-be9c67660bec") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.941286 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgm6k\" (UniqueName: \"kubernetes.io/projected/30ba513f-67c5-4e4f-b8a7-be9c67660bec-kube-api-access-kgm6k\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.974281 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:02Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:02.002136 5951 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 11:29:02.002188 5951 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 11:29:02.002205 5951 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 
11:29:02.002228 5951 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 11:29:02.002248 5951 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 11:29:02.002280 5951 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 11:29:02.002297 5951 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 11:29:02.002274 5951 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 11:29:02.002315 5951 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 11:29:02.002398 5951 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 11:29:02.002417 5951 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 11:29:02.002422 5951 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 11:29:02.002459 5951 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 11:29:02.002468 5951 factory.go:656] Stopping watch factory\\\\nI1001 11:29:02.002484 5951 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.978393 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgm6k\" (UniqueName: \"kubernetes.io/projected/30ba513f-67c5-4e4f-b8a7-be9c67660bec-kube-api-access-kgm6k\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:04 crc kubenswrapper[4669]: I1001 11:29:04.995012 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:04Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.012956 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/1.log" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.014071 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/0.log" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.019019 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16" exitCode=1 Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.019125 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16"} Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 
11:29:05.019186 4669 scope.go:117] "RemoveContainer" containerID="c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.020354 4669 scope.go:117] "RemoveContainer" containerID="123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16" Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.020649 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.025422 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.029746 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.029794 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.029811 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.029836 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.029855 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.050551 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.058206 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.073872 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.095755 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.116735 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc 
kubenswrapper[4669]: I1001 11:29:05.134429 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.134490 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.134508 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.134536 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.134556 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.134755 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.157575 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.177935 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.195238 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.240496 4669 secret.go:188] Couldn't get secret openshift-ovn-kubernetes/ovn-control-plane-metrics-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.240614 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6bac435-6175-448d-a057-faaa4fa8114b-ovn-control-plane-metrics-cert podName:f6bac435-6175-448d-a057-faaa4fa8114b nodeName:}" failed. No retries permitted until 2025-10-01 11:29:05.740581491 +0000 UTC m=+36.840146498 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovn-control-plane-metrics-cert" (UniqueName: "kubernetes.io/secret/f6bac435-6175-448d-a057-faaa4fa8114b-ovn-control-plane-metrics-cert") pod "ovnkube-control-plane-749d76644c-cknp5" (UID: "f6bac435-6175-448d-a057-faaa4fa8114b") : failed to sync secret cache: timed out waiting for the condition Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.243393 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.243449 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.243467 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.243492 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.243512 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.258770 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.279383 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.303554 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8
fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.323840 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.346828 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.346881 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.346900 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.346932 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.346952 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.362546 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.384023 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.397944 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.402741 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.422020 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.445050 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.448371 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.448528 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.448610 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:29:21.448564749 +0000 UTC m=+52.548129766 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.448761 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.448870 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.448935 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:29:21.448917437 +0000 UTC m=+52.548482444 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449005 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449047 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449073 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449186 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 11:29:21.449161883 +0000 UTC m=+52.548726900 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.449460 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.449522 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.449579 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449700 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449737 4669 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449778 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs podName:30ba513f-67c5-4e4f-b8a7-be9c67660bec nodeName:}" failed. No retries permitted until 2025-10-01 11:29:06.449752658 +0000 UTC m=+37.549317665 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs") pod "network-metrics-daemon-wvnw6" (UID: "30ba513f-67c5-4e4f-b8a7-be9c67660bec") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449782 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449824 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449846 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449886 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-01 11:29:21.44983843 +0000 UTC m=+52.549403447 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.449947 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 11:29:21.449931722 +0000 UTC m=+52.549496729 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.450990 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.451060 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.451116 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.451152 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.451212 4669 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.486190 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:02Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:02.002136 5951 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 11:29:02.002188 5951 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 11:29:02.002205 5951 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 
11:29:02.002228 5951 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 11:29:02.002248 5951 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 11:29:02.002280 5951 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 11:29:02.002297 5951 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 11:29:02.002274 5951 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 11:29:02.002315 5951 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 11:29:02.002398 5951 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 11:29:02.002417 5951 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 11:29:02.002422 5951 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 11:29:02.002459 5951 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 11:29:02.002468 5951 factory.go:656] Stopping watch factory\\\\nI1001 11:29:02.002484 5951 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 11:29:04.451266 6133 services_controller.go:434] Service 
openshift-kube-storage-version-migrator-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-storage-version-migrator-operator e1639a86-fb7f-46de-9d5e-4aee16dccea1 4372 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-storage-version-migrator-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc005e69dd7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:htt\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overri
des\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 
11:29:05.506386 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.529811 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592
717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.550169 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.555044 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.555134 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.555156 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.555187 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.555207 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.571182 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.593602 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.609748 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc 
kubenswrapper[4669]: I1001 11:29:05.644060 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.644067 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.644318 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.644482 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.658764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.658846 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.658868 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.658899 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.658926 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.752430 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6bac435-6175-448d-a057-faaa4fa8114b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.756893 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6bac435-6175-448d-a057-faaa4fa8114b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cknp5\" (UID: \"f6bac435-6175-448d-a057-faaa4fa8114b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.761557 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.761617 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.761634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.761659 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.761676 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.863378 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.865470 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.865540 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.865560 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.865590 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.865609 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:05 crc kubenswrapper[4669]: W1001 11:29:05.893698 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6bac435_6175_448d_a057_faaa4fa8114b.slice/crio-b7110a712830b803d0430eb5ea48c28b9187fcf49c4bdb90e173bd04e56f830f WatchSource:0}: Error finding container b7110a712830b803d0430eb5ea48c28b9187fcf49c4bdb90e173bd04e56f830f: Status 404 returned error can't find the container with id b7110a712830b803d0430eb5ea48c28b9187fcf49c4bdb90e173bd04e56f830f Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.938292 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.938354 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.938367 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.938391 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.938406 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.958801 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.963898 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.963958 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.963975 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.964001 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.964019 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:05 crc kubenswrapper[4669]: E1001 11:29:05.987358 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:05Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.993135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.993195 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.993214 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.993241 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:05 crc kubenswrapper[4669]: I1001 11:29:05.993261 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:05Z","lastTransitionTime":"2025-10-01T11:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:06 crc kubenswrapper[4669]: E1001 11:29:06.015001 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:06Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.021847 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.021902 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.021923 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.021949 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.021966 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:06Z","lastTransitionTime":"2025-10-01T11:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.025452 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/1.log" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.031365 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" event={"ID":"f6bac435-6175-448d-a057-faaa4fa8114b","Type":"ContainerStarted","Data":"b7110a712830b803d0430eb5ea48c28b9187fcf49c4bdb90e173bd04e56f830f"} Oct 01 11:29:06 crc kubenswrapper[4669]: E1001 11:29:06.043743 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:06Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.049347 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.049407 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.049426 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.049450 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.049469 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:06Z","lastTransitionTime":"2025-10-01T11:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:06 crc kubenswrapper[4669]: E1001 11:29:06.070519 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:06Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:06 crc kubenswrapper[4669]: E1001 11:29:06.070682 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.073660 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.073724 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.073751 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.073790 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.073816 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:06Z","lastTransitionTime":"2025-10-01T11:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.176786 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.176848 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.176867 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.176891 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.176909 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:06Z","lastTransitionTime":"2025-10-01T11:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.280635 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.280713 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.280739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.280771 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.280813 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:06Z","lastTransitionTime":"2025-10-01T11:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.384454 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.384505 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.384523 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.384551 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.384571 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:06Z","lastTransitionTime":"2025-10-01T11:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.460523 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:06 crc kubenswrapper[4669]: E1001 11:29:06.460820 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:06 crc kubenswrapper[4669]: E1001 11:29:06.460969 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs podName:30ba513f-67c5-4e4f-b8a7-be9c67660bec nodeName:}" failed. No retries permitted until 2025-10-01 11:29:08.460935798 +0000 UTC m=+39.560500805 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs") pod "network-metrics-daemon-wvnw6" (UID: "30ba513f-67c5-4e4f-b8a7-be9c67660bec") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.489051 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.489138 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.489157 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.489186 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.489209 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:06Z","lastTransitionTime":"2025-10-01T11:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.592217 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.592257 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.592267 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.592285 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.592335 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:06Z","lastTransitionTime":"2025-10-01T11:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.643807 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.643977 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:06 crc kubenswrapper[4669]: E1001 11:29:06.644091 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:06 crc kubenswrapper[4669]: E1001 11:29:06.644240 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.695489 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.695555 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.695573 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.695600 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.695619 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:06Z","lastTransitionTime":"2025-10-01T11:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.798951 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.799001 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.799017 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.799036 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.799049 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:06Z","lastTransitionTime":"2025-10-01T11:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.902405 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.902478 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.902495 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.902521 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:06 crc kubenswrapper[4669]: I1001 11:29:06.902541 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:06Z","lastTransitionTime":"2025-10-01T11:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.005594 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.005656 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.005673 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.005703 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.005723 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:07Z","lastTransitionTime":"2025-10-01T11:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.043723 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" event={"ID":"f6bac435-6175-448d-a057-faaa4fa8114b","Type":"ContainerStarted","Data":"4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.043798 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" event={"ID":"f6bac435-6175-448d-a057-faaa4fa8114b","Type":"ContainerStarted","Data":"12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.066004 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.084842 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc 
kubenswrapper[4669]: I1001 11:29:07.105123 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.108656 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.108709 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.108728 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.108754 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.108773 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:07Z","lastTransitionTime":"2025-10-01T11:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.126637 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.143502 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.166975 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.186208 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.204751 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.211738 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.211809 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.211826 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.211851 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.211875 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:07Z","lastTransitionTime":"2025-10-01T11:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.223952 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.244883 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.259415 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.280146 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.298123 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.315332 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.315396 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.315410 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.315436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.315454 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:07Z","lastTransitionTime":"2025-10-01T11:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.331118 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:02Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:02.002136 5951 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 11:29:02.002188 5951 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 11:29:02.002205 5951 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 
11:29:02.002228 5951 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 11:29:02.002248 5951 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 11:29:02.002280 5951 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 11:29:02.002297 5951 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 11:29:02.002274 5951 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 11:29:02.002315 5951 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 11:29:02.002398 5951 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 11:29:02.002417 5951 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 11:29:02.002422 5951 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 11:29:02.002459 5951 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 11:29:02.002468 5951 factory.go:656] Stopping watch factory\\\\nI1001 11:29:02.002484 5951 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 11:29:04.451266 6133 services_controller.go:434] Service 
openshift-kube-storage-version-migrator-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-storage-version-migrator-operator e1639a86-fb7f-46de-9d5e-4aee16dccea1 4372 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-storage-version-migrator-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc005e69dd7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:htt\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overri
des\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 
11:29:07.350827 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.388471 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.406465 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:07Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.418059 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.418172 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.418190 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 
11:29:07.418218 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.418238 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:07Z","lastTransitionTime":"2025-10-01T11:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.543408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.543462 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.543475 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.543496 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.543508 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:07Z","lastTransitionTime":"2025-10-01T11:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.643466 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.643602 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:07 crc kubenswrapper[4669]: E1001 11:29:07.643728 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:07 crc kubenswrapper[4669]: E1001 11:29:07.644268 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.646788 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.646836 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.646846 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.646860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.646874 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:07Z","lastTransitionTime":"2025-10-01T11:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.750475 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.750518 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.750529 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.750545 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.750556 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:07Z","lastTransitionTime":"2025-10-01T11:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.854326 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.854380 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.854391 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.854410 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.854423 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:07Z","lastTransitionTime":"2025-10-01T11:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.957868 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.957947 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.957967 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.957995 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:07 crc kubenswrapper[4669]: I1001 11:29:07.958017 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:07Z","lastTransitionTime":"2025-10-01T11:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.061564 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.061621 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.061633 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.061656 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.061680 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:08Z","lastTransitionTime":"2025-10-01T11:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.164140 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.164191 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.164205 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.164223 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.164235 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:08Z","lastTransitionTime":"2025-10-01T11:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.267197 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.267255 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.267268 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.267292 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.267304 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:08Z","lastTransitionTime":"2025-10-01T11:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.369754 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.369825 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.369923 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.369958 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.369980 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:08Z","lastTransitionTime":"2025-10-01T11:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.472758 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.472807 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.472816 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.472831 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.472842 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:08Z","lastTransitionTime":"2025-10-01T11:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.483586 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:08 crc kubenswrapper[4669]: E1001 11:29:08.483759 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:08 crc kubenswrapper[4669]: E1001 11:29:08.483825 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs podName:30ba513f-67c5-4e4f-b8a7-be9c67660bec nodeName:}" failed. No retries permitted until 2025-10-01 11:29:12.48380233 +0000 UTC m=+43.583367307 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs") pod "network-metrics-daemon-wvnw6" (UID: "30ba513f-67c5-4e4f-b8a7-be9c67660bec") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.576400 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.576466 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.576488 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.576512 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.576530 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:08Z","lastTransitionTime":"2025-10-01T11:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.643618 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.643722 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:08 crc kubenswrapper[4669]: E1001 11:29:08.643875 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:08 crc kubenswrapper[4669]: E1001 11:29:08.644049 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.679984 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.680059 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.680117 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.680148 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.680167 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:08Z","lastTransitionTime":"2025-10-01T11:29:08Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.783286 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.783330 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.783343 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.783361 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.783372 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:08Z","lastTransitionTime":"2025-10-01T11:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.886733 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.886797 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.886811 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.886831 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.886844 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:08Z","lastTransitionTime":"2025-10-01T11:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.989933 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.990004 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.990024 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.990052 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:08 crc kubenswrapper[4669]: I1001 11:29:08.990099 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:08Z","lastTransitionTime":"2025-10-01T11:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.093029 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.093158 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.093178 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.093237 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.093260 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:09Z","lastTransitionTime":"2025-10-01T11:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.196976 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.197037 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.197105 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.197135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.197153 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:09Z","lastTransitionTime":"2025-10-01T11:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.300290 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.300413 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.300439 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.300473 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.300499 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:09Z","lastTransitionTime":"2025-10-01T11:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.403712 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.404203 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.404319 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.404358 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.404380 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:09Z","lastTransitionTime":"2025-10-01T11:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.506968 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.507017 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.507031 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.507053 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.507067 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:09Z","lastTransitionTime":"2025-10-01T11:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.609782 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.609817 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.609831 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.609848 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.609858 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:09Z","lastTransitionTime":"2025-10-01T11:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.643572 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.643818 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:09 crc kubenswrapper[4669]: E1001 11:29:09.643930 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:09 crc kubenswrapper[4669]: E1001 11:29:09.644036 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.667810 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2
808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e
6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.682869 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.713893 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.713963 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.713990 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.714022 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.714047 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:09Z","lastTransitionTime":"2025-10-01T11:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.717067 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.742342 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.762965 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.785272 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.809221 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.820222 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.820276 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.820293 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.820316 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.820335 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:09Z","lastTransitionTime":"2025-10-01T11:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.845976 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:02Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:02.002136 5951 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 11:29:02.002188 5951 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 11:29:02.002205 5951 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 
11:29:02.002228 5951 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 11:29:02.002248 5951 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 11:29:02.002280 5951 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 11:29:02.002297 5951 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 11:29:02.002274 5951 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 11:29:02.002315 5951 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 11:29:02.002398 5951 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 11:29:02.002417 5951 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 11:29:02.002422 5951 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 11:29:02.002459 5951 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 11:29:02.002468 5951 factory.go:656] Stopping watch factory\\\\nI1001 11:29:02.002484 5951 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 11:29:04.451266 6133 services_controller.go:434] Service 
openshift-kube-storage-version-migrator-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-storage-version-migrator-operator e1639a86-fb7f-46de-9d5e-4aee16dccea1 4372 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-storage-version-migrator-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc005e69dd7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:htt\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overri
des\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 
11:29:09.864680 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.888326 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592
717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.913730 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.923895 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.923954 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.923971 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.923999 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.924042 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:09Z","lastTransitionTime":"2025-10-01T11:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.939746 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:09 crc kubenswrapper[4669]: I1001 11:29:09.971332 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.003471 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:09Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:10 crc 
kubenswrapper[4669]: I1001 11:29:10.027325 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.027321 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mo
untPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:10Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.027370 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.027547 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.027584 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.027595 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:10Z","lastTransitionTime":"2025-10-01T11:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.044754 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:10Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.060812 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:10Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.130163 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.130202 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.130214 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.130231 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.130245 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:10Z","lastTransitionTime":"2025-10-01T11:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.233310 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.233349 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.233360 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.233378 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.233389 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:10Z","lastTransitionTime":"2025-10-01T11:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.336522 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.336569 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.336582 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.336602 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.336615 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:10Z","lastTransitionTime":"2025-10-01T11:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.440119 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.440169 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.440183 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.440202 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.440216 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:10Z","lastTransitionTime":"2025-10-01T11:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.543132 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.543188 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.543232 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.543257 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.543269 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:10Z","lastTransitionTime":"2025-10-01T11:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.643401 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:10 crc kubenswrapper[4669]: E1001 11:29:10.643563 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.643417 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:10 crc kubenswrapper[4669]: E1001 11:29:10.643853 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.645716 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.645743 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.645757 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.645773 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.645784 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:10Z","lastTransitionTime":"2025-10-01T11:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.749110 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.749170 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.749184 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.749206 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.749222 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:10Z","lastTransitionTime":"2025-10-01T11:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.852596 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.852651 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.852662 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.852685 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.852700 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:10Z","lastTransitionTime":"2025-10-01T11:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.956645 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.956712 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.956730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.956760 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:10 crc kubenswrapper[4669]: I1001 11:29:10.956783 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:10Z","lastTransitionTime":"2025-10-01T11:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.059606 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.059827 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.059842 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.059865 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.059880 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:11Z","lastTransitionTime":"2025-10-01T11:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.163212 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.163282 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.163305 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.163340 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.163363 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:11Z","lastTransitionTime":"2025-10-01T11:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.267179 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.267267 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.267289 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.267322 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.267345 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:11Z","lastTransitionTime":"2025-10-01T11:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.370689 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.370779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.370811 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.370847 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.370871 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:11Z","lastTransitionTime":"2025-10-01T11:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.473811 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.473853 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.473866 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.473883 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.473897 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:11Z","lastTransitionTime":"2025-10-01T11:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.578058 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.578159 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.578175 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.578204 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.578219 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:11Z","lastTransitionTime":"2025-10-01T11:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.643453 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.643584 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:11 crc kubenswrapper[4669]: E1001 11:29:11.643743 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:11 crc kubenswrapper[4669]: E1001 11:29:11.643916 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.681893 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.681964 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.681981 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.682010 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.682033 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:11Z","lastTransitionTime":"2025-10-01T11:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.785863 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.785925 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.785935 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.785953 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.785964 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:11Z","lastTransitionTime":"2025-10-01T11:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.889368 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.889436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.889455 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.889483 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.889504 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:11Z","lastTransitionTime":"2025-10-01T11:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.992861 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.992928 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.992951 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.992982 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:11 crc kubenswrapper[4669]: I1001 11:29:11.993004 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:11Z","lastTransitionTime":"2025-10-01T11:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.096246 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.096338 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.096372 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.096404 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.096432 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:12Z","lastTransitionTime":"2025-10-01T11:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.200285 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.200377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.200408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.200438 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.200461 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:12Z","lastTransitionTime":"2025-10-01T11:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.304698 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.304759 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.304780 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.304807 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.304827 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:12Z","lastTransitionTime":"2025-10-01T11:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.408935 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.409021 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.409038 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.409061 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.409100 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:12Z","lastTransitionTime":"2025-10-01T11:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.511927 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.511991 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.512009 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.512035 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.512054 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:12Z","lastTransitionTime":"2025-10-01T11:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.536628 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:12 crc kubenswrapper[4669]: E1001 11:29:12.536807 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:12 crc kubenswrapper[4669]: E1001 11:29:12.536921 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs podName:30ba513f-67c5-4e4f-b8a7-be9c67660bec nodeName:}" failed. No retries permitted until 2025-10-01 11:29:20.536895995 +0000 UTC m=+51.636460972 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs") pod "network-metrics-daemon-wvnw6" (UID: "30ba513f-67c5-4e4f-b8a7-be9c67660bec") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.615615 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.615710 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.615732 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.615764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.615791 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:12Z","lastTransitionTime":"2025-10-01T11:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.643419 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:12 crc kubenswrapper[4669]: E1001 11:29:12.643596 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.643419 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:12 crc kubenswrapper[4669]: E1001 11:29:12.644511 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.719699 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.719755 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.719770 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.719794 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.719811 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:12Z","lastTransitionTime":"2025-10-01T11:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.822832 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.822924 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.822946 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.822974 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.822997 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:12Z","lastTransitionTime":"2025-10-01T11:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.925972 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.926023 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.926035 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.926052 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:12 crc kubenswrapper[4669]: I1001 11:29:12.926067 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:12Z","lastTransitionTime":"2025-10-01T11:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.029822 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.030153 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.030258 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.030326 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.030394 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:13Z","lastTransitionTime":"2025-10-01T11:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.134166 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.134241 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.134260 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.134289 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.134311 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:13Z","lastTransitionTime":"2025-10-01T11:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.238091 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.238133 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.238143 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.238161 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.238170 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:13Z","lastTransitionTime":"2025-10-01T11:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.341574 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.341938 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.342148 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.342319 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.342511 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:13Z","lastTransitionTime":"2025-10-01T11:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.445865 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.445948 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.445968 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.445998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.446022 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:13Z","lastTransitionTime":"2025-10-01T11:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.549395 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.549781 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.550287 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.550661 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.551019 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:13Z","lastTransitionTime":"2025-10-01T11:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.643869 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:13 crc kubenswrapper[4669]: E1001 11:29:13.644116 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.644273 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:13 crc kubenswrapper[4669]: E1001 11:29:13.644596 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.645643 4669 scope.go:117] "RemoveContainer" containerID="79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.653315 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.653357 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.653366 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.653384 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.653395 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:13Z","lastTransitionTime":"2025-10-01T11:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.756621 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.756676 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.756693 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.756714 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.756728 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:13Z","lastTransitionTime":"2025-10-01T11:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.859832 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.859879 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.859891 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.859906 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.859916 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:13Z","lastTransitionTime":"2025-10-01T11:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.963454 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.963594 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.963616 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.963644 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:13 crc kubenswrapper[4669]: I1001 11:29:13.963668 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:13Z","lastTransitionTime":"2025-10-01T11:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.068576 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.068636 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.068648 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.068668 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.068682 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:14Z","lastTransitionTime":"2025-10-01T11:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.076989 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.084197 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988"} Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.085208 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.107739 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2
d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/k
ubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.145424 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c044f14e4db07928433d0fe64b83646d0eb995ce27c486c05179e760d1def05a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:02Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:02.002136 5951 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 11:29:02.002188 5951 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 11:29:02.002205 5951 handler.go:190] Sending *v1.Node event handler 7 for 
removal\\\\nI1001 11:29:02.002228 5951 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 11:29:02.002248 5951 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 11:29:02.002280 5951 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 11:29:02.002297 5951 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 11:29:02.002274 5951 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 11:29:02.002315 5951 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 11:29:02.002398 5951 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 11:29:02.002417 5951 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 11:29:02.002422 5951 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 11:29:02.002459 5951 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 11:29:02.002468 5951 factory.go:656] Stopping watch factory\\\\nI1001 11:29:02.002484 5951 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 11:29:04.451266 6133 services_controller.go:434] Service 
openshift-kube-storage-version-migrator-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-storage-version-migrator-operator e1639a86-fb7f-46de-9d5e-4aee16dccea1 4372 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-storage-version-migrator-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc005e69dd7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:htt\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overri
des\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 
11:29:14.165055 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.171578 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.171613 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.171623 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.171640 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.171650 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:14Z","lastTransitionTime":"2025-10-01T11:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.196455 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.213655 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.226328 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.240728 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.258433 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.274656 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.274923 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.275140 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.275346 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.275505 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:14Z","lastTransitionTime":"2025-10-01T11:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.276989 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.294570 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.309336 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.321998 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc 
kubenswrapper[4669]: I1001 11:29:14.343193 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.355569 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.369923 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.378120 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.378286 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.378366 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.378464 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.378539 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:14Z","lastTransitionTime":"2025-10-01T11:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.384192 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.402452 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:14Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.481206 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.481270 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.481286 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.481313 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.481327 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:14Z","lastTransitionTime":"2025-10-01T11:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.585273 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.585319 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.585328 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.585344 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.585358 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:14Z","lastTransitionTime":"2025-10-01T11:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.644266 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.644275 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:14 crc kubenswrapper[4669]: E1001 11:29:14.644874 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:14 crc kubenswrapper[4669]: E1001 11:29:14.645167 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.689054 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.689142 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.689160 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.689188 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.689207 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:14Z","lastTransitionTime":"2025-10-01T11:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.793101 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.793168 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.793189 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.793214 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.793234 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:14Z","lastTransitionTime":"2025-10-01T11:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.896918 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.896995 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.897021 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.897057 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:14 crc kubenswrapper[4669]: I1001 11:29:14.897135 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:14Z","lastTransitionTime":"2025-10-01T11:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.000726 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.000786 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.000806 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.000831 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.000854 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:15Z","lastTransitionTime":"2025-10-01T11:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.104295 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.104363 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.104380 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.104407 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.104425 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:15Z","lastTransitionTime":"2025-10-01T11:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.207564 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.208020 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.208186 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.208331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.208477 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:15Z","lastTransitionTime":"2025-10-01T11:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.311574 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.311982 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.312157 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.312310 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.312436 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:15Z","lastTransitionTime":"2025-10-01T11:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.415627 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.415728 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.415752 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.415786 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.415842 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:15Z","lastTransitionTime":"2025-10-01T11:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.519316 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.519404 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.519423 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.519454 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.519472 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:15Z","lastTransitionTime":"2025-10-01T11:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.622050 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.622105 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.622117 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.622134 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.622143 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:15Z","lastTransitionTime":"2025-10-01T11:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.643915 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.643949 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:15 crc kubenswrapper[4669]: E1001 11:29:15.644047 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:15 crc kubenswrapper[4669]: E1001 11:29:15.644268 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.724860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.725397 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.725409 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.725427 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.725438 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:15Z","lastTransitionTime":"2025-10-01T11:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.828108 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.828168 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.828182 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.828202 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.828215 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:15Z","lastTransitionTime":"2025-10-01T11:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.931195 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.931239 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.931250 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.931269 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:15 crc kubenswrapper[4669]: I1001 11:29:15.931345 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:15Z","lastTransitionTime":"2025-10-01T11:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.035143 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.035186 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.035198 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.035213 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.035224 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.139719 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.139783 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.139804 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.139831 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.139851 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.242972 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.243038 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.243056 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.243120 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.243145 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.346402 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.346468 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.346480 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.346507 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.346520 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.405595 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.405691 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.405758 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.405791 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.405815 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: E1001 11:29:16.425016 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:16Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.431107 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.431180 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.431192 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.431212 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.431228 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: E1001 11:29:16.447658 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:16Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.453286 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.453315 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.453324 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.453357 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.453369 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: E1001 11:29:16.472959 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:16Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.477646 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.477690 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.477702 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.477733 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.477749 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: E1001 11:29:16.494368 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:16Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.498699 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.498764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.498780 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.498802 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.498816 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: E1001 11:29:16.513744 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:16Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:16 crc kubenswrapper[4669]: E1001 11:29:16.513919 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.515697 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.515742 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.515756 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.515781 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.515795 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.619051 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.619148 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.619168 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.619195 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.619216 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.643419 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.643525 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:16 crc kubenswrapper[4669]: E1001 11:29:16.643613 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:16 crc kubenswrapper[4669]: E1001 11:29:16.643735 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.722771 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.722841 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.722860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.722887 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.722908 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.826353 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.826419 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.826436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.826473 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.826506 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.930422 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.930541 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.930552 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.930575 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:16 crc kubenswrapper[4669]: I1001 11:29:16.930587 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:16Z","lastTransitionTime":"2025-10-01T11:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.033787 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.033867 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.033885 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.033916 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.033935 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:17Z","lastTransitionTime":"2025-10-01T11:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.137351 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.137419 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.137443 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.137478 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.137500 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:17Z","lastTransitionTime":"2025-10-01T11:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.240813 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.240887 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.240905 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.240932 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.240951 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:17Z","lastTransitionTime":"2025-10-01T11:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.344560 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.344608 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.344618 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.344634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.344645 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:17Z","lastTransitionTime":"2025-10-01T11:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.448719 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.448791 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.448808 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.448836 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.448855 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:17Z","lastTransitionTime":"2025-10-01T11:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.551461 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.551549 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.551559 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.551576 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.551587 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:17Z","lastTransitionTime":"2025-10-01T11:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.643657 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.643828 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:17 crc kubenswrapper[4669]: E1001 11:29:17.643863 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:17 crc kubenswrapper[4669]: E1001 11:29:17.644071 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.654314 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.654387 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.654410 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.654441 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.654463 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:17Z","lastTransitionTime":"2025-10-01T11:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.757340 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.757394 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.757419 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.757445 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.757462 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:17Z","lastTransitionTime":"2025-10-01T11:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.860523 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.860565 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.860576 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.860594 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.860607 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:17Z","lastTransitionTime":"2025-10-01T11:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.964046 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.964135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.964154 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.964180 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:17 crc kubenswrapper[4669]: I1001 11:29:17.964199 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:17Z","lastTransitionTime":"2025-10-01T11:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.067912 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.067998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.068018 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.068047 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.068068 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:18Z","lastTransitionTime":"2025-10-01T11:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.170731 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.170782 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.170792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.170811 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.170825 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:18Z","lastTransitionTime":"2025-10-01T11:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.273850 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.273954 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.273986 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.274025 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.274053 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:18Z","lastTransitionTime":"2025-10-01T11:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.377669 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.377739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.377758 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.377787 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.377808 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:18Z","lastTransitionTime":"2025-10-01T11:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.481041 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.481108 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.481118 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.481138 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.481150 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:18Z","lastTransitionTime":"2025-10-01T11:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.583871 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.583952 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.583977 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.584011 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.584051 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:18Z","lastTransitionTime":"2025-10-01T11:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.643841 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.643858 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:18 crc kubenswrapper[4669]: E1001 11:29:18.644454 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:18 crc kubenswrapper[4669]: E1001 11:29:18.644596 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.644787 4669 scope.go:117] "RemoveContainer" containerID="123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.658031 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f
219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.674387 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d
4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.687917 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.687995 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.688027 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:18 crc 
kubenswrapper[4669]: I1001 11:29:18.688049 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.688065 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:18Z","lastTransitionTime":"2025-10-01T11:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.694487 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.728353 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 11:29:04.451266 6133 services_controller.go:434] Service openshift-kube-storage-version-migrator-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-storage-version-migrator-operator e1639a86-fb7f-46de-9d5e-4aee16dccea1 4372 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-storage-version-migrator-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc005e69dd7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:htt\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f72
19ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.748764 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.778925 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.790950 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.790990 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.791000 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.791019 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.791033 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:18Z","lastTransitionTime":"2025-10-01T11:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.797226 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.814656 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.829551 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc 
kubenswrapper[4669]: I1001 11:29:18.846799 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.865138 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.881129 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.894507 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.894542 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.894551 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:18 crc 
kubenswrapper[4669]: I1001 11:29:18.894588 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.894599 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:18Z","lastTransitionTime":"2025-10-01T11:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.898308 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 
11:29:18.911170 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.934678 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.951240 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\
",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:18 crc kubenswrapper[4669]: I1001 11:29:18.974357 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cf
d7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e4
96fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:18Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.010399 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.010482 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.010497 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.010523 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.010538 4669 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:19Z","lastTransitionTime":"2025-10-01T11:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.117692 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.118145 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.118218 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.118288 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.118350 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:19Z","lastTransitionTime":"2025-10-01T11:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.120883 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/1.log" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.123978 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5"} Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.124218 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.149679 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c
9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.168041 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.189527 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.205671 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.219351 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.221222 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.221264 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.221274 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.221290 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.221302 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:19Z","lastTransitionTime":"2025-10-01T11:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.232547 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.244976 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.264400 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 11:29:04.451266 6133 services_controller.go:434] Service openshift-kube-storage-version-migrator-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-storage-version-migrator-operator e1639a86-fb7f-46de-9d5e-4aee16dccea1 4372 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-storage-version-migrator-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc005e69dd7 \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:htt\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},
{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.280169 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.298357 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.314153 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.324866 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.324916 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.324930 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.324956 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.324973 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:19Z","lastTransitionTime":"2025-10-01T11:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.329544 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.349632 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.369235 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc 
kubenswrapper[4669]: I1001 11:29:19.381747 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g
fkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.397380 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.415711 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.428254 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 
11:29:19.428294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.428305 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.428324 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.428336 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:19Z","lastTransitionTime":"2025-10-01T11:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.454452 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.531707 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.531992 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.532178 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.532347 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.532497 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:19Z","lastTransitionTime":"2025-10-01T11:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.635247 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.635294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.635307 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.635329 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.635344 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:19Z","lastTransitionTime":"2025-10-01T11:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.654283 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.654421 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:19 crc kubenswrapper[4669]: E1001 11:29:19.654551 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:19 crc kubenswrapper[4669]: E1001 11:29:19.654735 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.671260 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.686780 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.718365 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.
168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b
54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.740352 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.740410 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.740444 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.740471 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.740491 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:19Z","lastTransitionTime":"2025-10-01T11:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.741570 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.759654 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.778547 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.803032 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.823934 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 11:29:04.451266 6133 services_controller.go:434] Service openshift-kube-storage-version-migrator-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-storage-version-migrator-operator e1639a86-fb7f-46de-9d5e-4aee16dccea1 4372 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-storage-version-migrator-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc005e69dd7 \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:htt\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},
{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.841047 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.844383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.844555 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.844916 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.845203 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.845449 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:19Z","lastTransitionTime":"2025-10-01T11:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.863114 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.877865 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.890017 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.901248 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc 
kubenswrapper[4669]: I1001 11:29:19.923310 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.939270 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.950176 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.950230 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.950241 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.950260 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.950275 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:19Z","lastTransitionTime":"2025-10-01T11:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.952703 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:19 crc kubenswrapper[4669]: I1001 11:29:19.964388 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:19Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.052983 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.053023 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.053052 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.053069 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.053092 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:20Z","lastTransitionTime":"2025-10-01T11:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.130893 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/2.log" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.131988 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/1.log" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.135867 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5" exitCode=1 Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.135930 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5"} Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.135979 4669 scope.go:117] "RemoveContainer" containerID="123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.142501 4669 scope.go:117] "RemoveContainer" containerID="1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5" Oct 01 11:29:20 crc kubenswrapper[4669]: E1001 11:29:20.142842 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.156693 4669 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.156742 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.156763 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.156790 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.156810 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:20Z","lastTransitionTime":"2025-10-01T11:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.160533 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.180239 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.196344 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.212969 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T1
1:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.233877 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8
fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.263779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.263826 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.263845 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.263870 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.263894 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:20Z","lastTransitionTime":"2025-10-01T11:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.264801 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.286413 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.303359 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.319552 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.338592 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.364670 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://123899b5596c14a3000d3273ea70f143bc4c3124e334308996c8bfa9815bac16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 11:29:04.451266 6133 services_controller.go:434] Service openshift-kube-storage-version-migrator-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-storage-version-migrator-operator e1639a86-fb7f-46de-9d5e-4aee16dccea1 4372 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-storage-version-migrator-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc005e69dd7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:htt\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording 
success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cn
i-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.366381 4669 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.366430 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.366447 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.366472 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.366491 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:20Z","lastTransitionTime":"2025-10-01T11:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.385165 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700
cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.409665 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.428462 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.449144 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.466966 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc 
kubenswrapper[4669]: I1001 11:29:20.472007 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.472124 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.472153 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.472188 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.472212 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:20Z","lastTransitionTime":"2025-10-01T11:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.495341 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:20Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.576998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.577149 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.577173 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.577201 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.577221 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:20Z","lastTransitionTime":"2025-10-01T11:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.637462 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:20 crc kubenswrapper[4669]: E1001 11:29:20.637726 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:20 crc kubenswrapper[4669]: E1001 11:29:20.637919 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs podName:30ba513f-67c5-4e4f-b8a7-be9c67660bec nodeName:}" failed. No retries permitted until 2025-10-01 11:29:36.637840195 +0000 UTC m=+67.737405212 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs") pod "network-metrics-daemon-wvnw6" (UID: "30ba513f-67c5-4e4f-b8a7-be9c67660bec") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.643188 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:20 crc kubenswrapper[4669]: E1001 11:29:20.643377 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.643185 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:20 crc kubenswrapper[4669]: E1001 11:29:20.643548 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.680378 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.680446 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.680463 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.680491 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.680510 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:20Z","lastTransitionTime":"2025-10-01T11:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.783834 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.783894 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.783913 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.783941 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.783960 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:20Z","lastTransitionTime":"2025-10-01T11:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.887419 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.887760 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.887853 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.887948 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.888038 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:20Z","lastTransitionTime":"2025-10-01T11:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.991435 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.991487 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.991498 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.991518 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:20 crc kubenswrapper[4669]: I1001 11:29:20.991532 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:20Z","lastTransitionTime":"2025-10-01T11:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.095494 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.095549 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.095565 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.095590 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.095607 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:21Z","lastTransitionTime":"2025-10-01T11:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.147600 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/2.log" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.154633 4669 scope.go:117] "RemoveContainer" containerID="1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5" Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.155415 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.177495 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.196533 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.198732 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.198800 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.198823 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.198852 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.198873 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:21Z","lastTransitionTime":"2025-10-01T11:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.215483 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.234556 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.258728 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.292036 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h 
after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f72
19ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.302645 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.302698 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.302709 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.302734 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.302748 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:21Z","lastTransitionTime":"2025-10-01T11:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.308782 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.335165 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.352573 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.371126 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.392273 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.405771 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.406137 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.406272 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.406385 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.406494 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:21Z","lastTransitionTime":"2025-10-01T11:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.416706 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z 
is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.441950 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.462615 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.477428 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.494625 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.509606 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.509675 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.509686 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.509704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.509717 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:21Z","lastTransitionTime":"2025-10-01T11:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.510463 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:21Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:21 crc 
kubenswrapper[4669]: I1001 11:29:21.548224 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.548492 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:29:53.548450276 +0000 UTC m=+84.648015403 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.548621 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.548788 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.548809 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.548851 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:29:53.548839845 +0000 UTC m=+84.648405062 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.549158 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.549200 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.549220 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 
11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.549315 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 11:29:53.549286326 +0000 UTC m=+84.648851333 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.550209 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.550290 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:29:53.55027247 +0000 UTC m=+84.649837487 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.550071 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.550457 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.550617 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.550658 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.550679 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.550757 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 11:29:53.550735701 +0000 UTC m=+84.650300708 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.612653 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.612708 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.612722 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.612742 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.612759 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:21Z","lastTransitionTime":"2025-10-01T11:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.644383 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.644453 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.644617 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:21 crc kubenswrapper[4669]: E1001 11:29:21.645125 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.715250 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.715302 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.715317 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.715337 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.715352 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:21Z","lastTransitionTime":"2025-10-01T11:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.818298 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.818376 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.818401 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.818429 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.818451 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:21Z","lastTransitionTime":"2025-10-01T11:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.923550 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.923944 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.924028 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.924188 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:21 crc kubenswrapper[4669]: I1001 11:29:21.924314 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:21Z","lastTransitionTime":"2025-10-01T11:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.027711 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.028648 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.028670 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.028700 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.028719 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:22Z","lastTransitionTime":"2025-10-01T11:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.133119 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.133184 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.133200 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.133223 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.133239 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:22Z","lastTransitionTime":"2025-10-01T11:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.238273 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.238368 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.238390 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.238419 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.238436 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:22Z","lastTransitionTime":"2025-10-01T11:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.313896 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.330680 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.332315 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\
"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.343619 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.343688 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.343707 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.343735 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.343756 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:22Z","lastTransitionTime":"2025-10-01T11:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.348795 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.369201 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.403989 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.424560 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.442900 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.446714 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.446749 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.446759 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.446777 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.446789 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:22Z","lastTransitionTime":"2025-10-01T11:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.460801 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.479032 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.523274 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h 
after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f72
19ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.549846 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.549903 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.549919 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.549938 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.549951 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:22Z","lastTransitionTime":"2025-10-01T11:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.580490 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.603124 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.619003 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.634813 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.643429 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.643564 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:22 crc kubenswrapper[4669]: E1001 11:29:22.643602 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:22 crc kubenswrapper[4669]: E1001 11:29:22.643916 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.649809 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc 
kubenswrapper[4669]: I1001 11:29:22.652607 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.652663 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.652678 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.652695 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.652707 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:22Z","lastTransitionTime":"2025-10-01T11:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.667763 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.681106 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.698396 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:22Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.755666 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.755713 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.755733 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.755754 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.755771 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:22Z","lastTransitionTime":"2025-10-01T11:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.858762 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.858808 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.858820 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.858838 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.858852 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:22Z","lastTransitionTime":"2025-10-01T11:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.962807 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.962886 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.962903 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.962932 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:22 crc kubenswrapper[4669]: I1001 11:29:22.962951 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:22Z","lastTransitionTime":"2025-10-01T11:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.067464 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.067560 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.067578 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.067607 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.067631 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:23Z","lastTransitionTime":"2025-10-01T11:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.172446 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.172502 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.172522 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.172547 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.172562 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:23Z","lastTransitionTime":"2025-10-01T11:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.276110 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.276161 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.276231 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.276281 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.276297 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:23Z","lastTransitionTime":"2025-10-01T11:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.378771 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.378815 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.378825 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.378840 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.378850 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:23Z","lastTransitionTime":"2025-10-01T11:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.481547 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.481633 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.481647 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.481668 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.481709 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:23Z","lastTransitionTime":"2025-10-01T11:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.584686 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.584744 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.584758 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.584779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.584792 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:23Z","lastTransitionTime":"2025-10-01T11:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.644017 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.644024 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:23 crc kubenswrapper[4669]: E1001 11:29:23.644247 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:23 crc kubenswrapper[4669]: E1001 11:29:23.644402 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.687275 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.687331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.687342 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.687360 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.687371 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:23Z","lastTransitionTime":"2025-10-01T11:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.791053 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.791110 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.791123 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.791142 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.791154 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:23Z","lastTransitionTime":"2025-10-01T11:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.893911 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.894012 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.894027 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.894051 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.894063 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:23Z","lastTransitionTime":"2025-10-01T11:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.996459 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.996526 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.996548 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.996574 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:23 crc kubenswrapper[4669]: I1001 11:29:23.996597 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:23Z","lastTransitionTime":"2025-10-01T11:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.099409 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.099464 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.099475 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.099495 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.099509 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:24Z","lastTransitionTime":"2025-10-01T11:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.202809 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.202853 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.202880 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.202900 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.202915 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:24Z","lastTransitionTime":"2025-10-01T11:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.306983 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.307037 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.307055 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.307118 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.307138 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:24Z","lastTransitionTime":"2025-10-01T11:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.410264 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.410331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.410351 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.410377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.410395 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:24Z","lastTransitionTime":"2025-10-01T11:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.513995 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.514413 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.514568 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.514708 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.514839 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:24Z","lastTransitionTime":"2025-10-01T11:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.618306 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.618395 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.618414 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.618444 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.618468 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:24Z","lastTransitionTime":"2025-10-01T11:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.643875 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.643888 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:24 crc kubenswrapper[4669]: E1001 11:29:24.644472 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:24 crc kubenswrapper[4669]: E1001 11:29:24.644520 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.721707 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.721768 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.721783 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.721805 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.721823 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:24Z","lastTransitionTime":"2025-10-01T11:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.826418 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.826484 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.826500 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.826524 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.826546 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:24Z","lastTransitionTime":"2025-10-01T11:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.930056 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.930371 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.930518 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.930625 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:24 crc kubenswrapper[4669]: I1001 11:29:24.930707 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:24Z","lastTransitionTime":"2025-10-01T11:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.034829 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.035245 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.035392 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.035510 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.035634 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:25Z","lastTransitionTime":"2025-10-01T11:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.139020 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.139108 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.139127 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.139153 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.139172 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:25Z","lastTransitionTime":"2025-10-01T11:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.242135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.242214 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.242243 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.242270 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.242290 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:25Z","lastTransitionTime":"2025-10-01T11:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.345579 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.345987 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.346165 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.346326 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.346455 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:25Z","lastTransitionTime":"2025-10-01T11:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.449938 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.450013 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.450027 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.450045 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.450061 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:25Z","lastTransitionTime":"2025-10-01T11:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.553587 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.554037 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.554545 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.554747 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.554954 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:25Z","lastTransitionTime":"2025-10-01T11:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.644434 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.644480 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:25 crc kubenswrapper[4669]: E1001 11:29:25.644711 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:25 crc kubenswrapper[4669]: E1001 11:29:25.644737 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.658162 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.658275 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.658294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.658322 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.658340 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:25Z","lastTransitionTime":"2025-10-01T11:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.761155 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.761223 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.761248 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.761281 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.761307 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:25Z","lastTransitionTime":"2025-10-01T11:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.865366 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.865740 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.865829 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.865907 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.865981 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:25Z","lastTransitionTime":"2025-10-01T11:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.969222 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.969308 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.969327 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.969764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:25 crc kubenswrapper[4669]: I1001 11:29:25.969792 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:25Z","lastTransitionTime":"2025-10-01T11:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.073545 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.073894 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.073959 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.074109 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.074175 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.177279 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.177684 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.177755 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.177836 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.177897 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.281794 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.282616 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.282731 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.282855 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.282946 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.385958 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.386017 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.386035 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.386061 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.386123 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.489053 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.489161 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.489180 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.489256 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.489277 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.552406 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.552959 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.553157 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.553300 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.553463 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: E1001 11:29:26.571878 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:26Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.577176 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.577321 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.577416 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.577527 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.577607 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: E1001 11:29:26.592279 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:26Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.597931 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.597998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.598015 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.598036 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.598050 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: E1001 11:29:26.617715 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:26Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.622627 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.622667 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.622677 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.622692 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.622703 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.643494 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.643661 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:26 crc kubenswrapper[4669]: E1001 11:29:26.643668 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:26 crc kubenswrapper[4669]: E1001 11:29:26.643692 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:26Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:26 crc kubenswrapper[4669]: E1001 11:29:26.644246 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.648878 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.648923 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.648942 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.648962 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.648980 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: E1001 11:29:26.665250 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:26Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:26 crc kubenswrapper[4669]: E1001 11:29:26.665490 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.667905 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.667961 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.667982 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.668010 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.668031 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.771619 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.771683 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.771706 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.771734 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.771754 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.875571 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.875647 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.875667 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.875692 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.875713 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.979123 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.979153 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.979161 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.979174 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:26 crc kubenswrapper[4669]: I1001 11:29:26.979183 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:26Z","lastTransitionTime":"2025-10-01T11:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.083159 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.083268 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.083286 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.083313 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.083333 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:27Z","lastTransitionTime":"2025-10-01T11:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.186675 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.187072 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.187227 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.187387 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.187486 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:27Z","lastTransitionTime":"2025-10-01T11:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.291352 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.291397 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.291408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.291430 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.291443 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:27Z","lastTransitionTime":"2025-10-01T11:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.395880 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.395946 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.395970 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.395999 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.396019 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:27Z","lastTransitionTime":"2025-10-01T11:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.499764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.499820 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.499845 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.499876 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.499894 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:27Z","lastTransitionTime":"2025-10-01T11:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.603208 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.603690 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.603853 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.603998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.604188 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:27Z","lastTransitionTime":"2025-10-01T11:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.643305 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.643248 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:27 crc kubenswrapper[4669]: E1001 11:29:27.643493 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:27 crc kubenswrapper[4669]: E1001 11:29:27.643609 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.707467 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.707530 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.707550 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.707576 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.707599 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:27Z","lastTransitionTime":"2025-10-01T11:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.811272 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.811352 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.811378 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.811414 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.811444 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:27Z","lastTransitionTime":"2025-10-01T11:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.914899 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.914956 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.914974 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.914998 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:27 crc kubenswrapper[4669]: I1001 11:29:27.915020 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:27Z","lastTransitionTime":"2025-10-01T11:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.018899 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.018964 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.018975 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.018997 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.019009 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:28Z","lastTransitionTime":"2025-10-01T11:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.122253 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.122333 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.122357 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.122391 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.122416 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:28Z","lastTransitionTime":"2025-10-01T11:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.230565 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.231268 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.231365 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.232037 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.232208 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:28Z","lastTransitionTime":"2025-10-01T11:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.335385 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.335433 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.335442 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.335460 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.335472 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:28Z","lastTransitionTime":"2025-10-01T11:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.439593 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.440183 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.440376 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.440572 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.440754 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:28Z","lastTransitionTime":"2025-10-01T11:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.544676 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.544740 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.544763 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.544793 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.544814 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:28Z","lastTransitionTime":"2025-10-01T11:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.643192 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.643423 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:28 crc kubenswrapper[4669]: E1001 11:29:28.643627 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:28 crc kubenswrapper[4669]: E1001 11:29:28.643745 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.648544 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.648801 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.648887 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.648913 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.648982 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:28Z","lastTransitionTime":"2025-10-01T11:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.752964 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.753403 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.753547 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.753732 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.753910 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:28Z","lastTransitionTime":"2025-10-01T11:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.857343 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.857381 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.857389 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.857417 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.857426 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:28Z","lastTransitionTime":"2025-10-01T11:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.959898 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.959965 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.959987 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.960019 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:28 crc kubenswrapper[4669]: I1001 11:29:28.960049 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:28Z","lastTransitionTime":"2025-10-01T11:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.064704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.064751 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.064766 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.064786 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.064800 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:29Z","lastTransitionTime":"2025-10-01T11:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.168011 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.168117 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.168137 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.168165 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.168186 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:29Z","lastTransitionTime":"2025-10-01T11:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.273338 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.273408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.273429 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.273461 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.273482 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:29Z","lastTransitionTime":"2025-10-01T11:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.377353 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.377455 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.377521 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.377550 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.377609 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:29Z","lastTransitionTime":"2025-10-01T11:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.480614 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.480708 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.480727 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.480751 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.480770 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:29Z","lastTransitionTime":"2025-10-01T11:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.583919 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.583968 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.583978 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.583995 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.584007 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:29Z","lastTransitionTime":"2025-10-01T11:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.643249 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.643255 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:29 crc kubenswrapper[4669]: E1001 11:29:29.643439 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:29 crc kubenswrapper[4669]: E1001 11:29:29.643547 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.672539 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.688298 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.688408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.688436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 
11:29:29.688470 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.688494 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:29Z","lastTransitionTime":"2025-10-01T11:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.692363 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.713217 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d
4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.737142 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.767493 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h 
after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f72
19ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.791628 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.792814 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.792867 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.792884 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.792905 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.792918 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:29Z","lastTransitionTime":"2025-10-01T11:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.826521 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:
32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.849644 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.870722 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.886515 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc 
kubenswrapper[4669]: I1001 11:29:29.896461 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.896501 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.896517 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.896536 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.896554 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:29Z","lastTransitionTime":"2025-10-01T11:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.902598 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6b93a6-fa53-42b0-8563-6ea4123a0cd7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb4b78de0070f57401ab73d5f6e47d62a5752c1db89a5a25d2ba05592c6878a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15bbac1ae30396ae6308715bed2913
56384e32beecf27a0464939b01a6c6d44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db32ff64b8dd7190a15a38b7c3de701a2381b513d437e4577e3ddb54fa614c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.922564 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.938027 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.952022 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.966799 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.980255 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:29 crc kubenswrapper[4669]: I1001 11:29:29.995060 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:29Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.000950 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.001012 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.001071 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.001133 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.001149 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:30Z","lastTransitionTime":"2025-10-01T11:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.019985 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:30Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.103907 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.103983 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.104003 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.104034 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.104058 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:30Z","lastTransitionTime":"2025-10-01T11:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.207971 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.208016 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.208027 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.208044 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.208060 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:30Z","lastTransitionTime":"2025-10-01T11:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.312271 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.312335 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.312354 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.312381 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.312401 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:30Z","lastTransitionTime":"2025-10-01T11:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.416337 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.416422 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.416448 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.416484 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.416511 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:30Z","lastTransitionTime":"2025-10-01T11:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.519781 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.519862 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.519874 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.519894 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.519917 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:30Z","lastTransitionTime":"2025-10-01T11:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.623897 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.624004 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.624021 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.624051 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.624072 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:30Z","lastTransitionTime":"2025-10-01T11:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.651738 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:30 crc kubenswrapper[4669]: E1001 11:29:30.652607 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.652398 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:30 crc kubenswrapper[4669]: E1001 11:29:30.653053 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.728276 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.728359 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.728385 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.728423 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.728460 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:30Z","lastTransitionTime":"2025-10-01T11:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.832553 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.832607 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.832620 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.832643 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.832660 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:30Z","lastTransitionTime":"2025-10-01T11:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.936445 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.936511 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.936526 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.936543 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:30 crc kubenswrapper[4669]: I1001 11:29:30.936553 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:30Z","lastTransitionTime":"2025-10-01T11:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.040093 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.040141 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.040151 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.040171 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.040183 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:31Z","lastTransitionTime":"2025-10-01T11:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.143539 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.143597 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.143608 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.143625 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.143635 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:31Z","lastTransitionTime":"2025-10-01T11:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.246726 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.246798 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.246813 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.246840 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.246860 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:31Z","lastTransitionTime":"2025-10-01T11:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.350259 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.350332 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.350383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.350411 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.350433 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:31Z","lastTransitionTime":"2025-10-01T11:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.453944 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.454010 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.454033 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.454058 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.454123 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:31Z","lastTransitionTime":"2025-10-01T11:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.557599 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.557673 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.557692 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.557719 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.557739 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:31Z","lastTransitionTime":"2025-10-01T11:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.643519 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.643629 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:31 crc kubenswrapper[4669]: E1001 11:29:31.643796 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:31 crc kubenswrapper[4669]: E1001 11:29:31.643979 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.660574 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.660642 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.660668 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.660697 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.660719 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:31Z","lastTransitionTime":"2025-10-01T11:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.764308 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.764383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.764404 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.764434 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.764457 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:31Z","lastTransitionTime":"2025-10-01T11:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.867277 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.867350 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.867373 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.867411 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.867435 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:31Z","lastTransitionTime":"2025-10-01T11:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.970673 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.970752 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.970780 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.970812 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:31 crc kubenswrapper[4669]: I1001 11:29:31.970839 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:31Z","lastTransitionTime":"2025-10-01T11:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.074726 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.074806 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.074825 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.074854 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.074876 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:32Z","lastTransitionTime":"2025-10-01T11:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.178856 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.178923 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.178942 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.178969 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.178993 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:32Z","lastTransitionTime":"2025-10-01T11:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.282704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.282766 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.282788 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.282819 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.282844 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:32Z","lastTransitionTime":"2025-10-01T11:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.385245 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.385318 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.385343 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.385377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.385403 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:32Z","lastTransitionTime":"2025-10-01T11:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.421234 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.437425 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.461213 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a
892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc0
5a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:5
8Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.480607 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.488881 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.488929 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.488943 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.488967 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.488982 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:32Z","lastTransitionTime":"2025-10-01T11:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.513940 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:
32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.536342 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.554059 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.571710 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.593141 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.593605 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.593658 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.593669 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.593691 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.593702 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:32Z","lastTransitionTime":"2025-10-01T11:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.618810 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h 
after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f72
19ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.635790 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6b93a6-fa53-42b0-8563-6ea4123a0cd7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb4b78de0070f57401ab73d5f6e47d62a5752c1db89a5a25d2ba05592c6878a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15bbac1ae30396ae6308715bed291356384e32beecf27a0464939b01a6c6d44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db32ff64b8dd7190a15a38b7c3de701a2381b513d437e4577e3ddb54fa614c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.643143 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.643143 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:32 crc kubenswrapper[4669]: E1001 11:29:32.643347 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:32 crc kubenswrapper[4669]: E1001 11:29:32.643476 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.644285 4669 scope.go:117] "RemoveContainer" containerID="1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5" Oct 01 11:29:32 crc kubenswrapper[4669]: E1001 11:29:32.644499 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.658005 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 
11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.678907 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.696398 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.697386 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.697443 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.697463 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:32 crc 
kubenswrapper[4669]: I1001 11:29:32.697492 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.697514 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:32Z","lastTransitionTime":"2025-10-01T11:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.714250 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.726494 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc 
kubenswrapper[4669]: I1001 11:29:32.741396 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.752250 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.800614 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.801020 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.801108 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.801175 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.801243 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:32Z","lastTransitionTime":"2025-10-01T11:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.801787 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:32Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.904325 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:32 crc 
kubenswrapper[4669]: I1001 11:29:32.904370 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.904382 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.904402 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:32 crc kubenswrapper[4669]: I1001 11:29:32.904416 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:32Z","lastTransitionTime":"2025-10-01T11:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.006787 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.007161 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.007171 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.007185 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.007195 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:33Z","lastTransitionTime":"2025-10-01T11:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.109173 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.109214 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.109223 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.109239 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.109251 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:33Z","lastTransitionTime":"2025-10-01T11:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.212813 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.213292 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.213400 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.213517 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.213623 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:33Z","lastTransitionTime":"2025-10-01T11:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.317045 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.317134 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.317147 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.317167 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.317183 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:33Z","lastTransitionTime":"2025-10-01T11:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.420021 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.420135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.420161 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.420190 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.420214 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:33Z","lastTransitionTime":"2025-10-01T11:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.524313 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.524389 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.524412 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.524445 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.524471 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:33Z","lastTransitionTime":"2025-10-01T11:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.628302 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.628712 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.628822 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.628944 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.629069 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:33Z","lastTransitionTime":"2025-10-01T11:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.643611 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.643683 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:33 crc kubenswrapper[4669]: E1001 11:29:33.644071 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:33 crc kubenswrapper[4669]: E1001 11:29:33.644294 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.732355 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.732416 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.732430 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.732453 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.732470 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:33Z","lastTransitionTime":"2025-10-01T11:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.835356 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.835478 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.835540 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.835608 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.835669 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:33Z","lastTransitionTime":"2025-10-01T11:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.939784 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.940180 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.940291 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.940388 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:33 crc kubenswrapper[4669]: I1001 11:29:33.940475 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:33Z","lastTransitionTime":"2025-10-01T11:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.045041 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.045391 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.045815 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.046224 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.046648 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:34Z","lastTransitionTime":"2025-10-01T11:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.150436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.150494 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.150508 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.150533 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.150551 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:34Z","lastTransitionTime":"2025-10-01T11:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.254511 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.254912 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.255038 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.255186 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.255300 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:34Z","lastTransitionTime":"2025-10-01T11:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.358394 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.358436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.358448 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.358466 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.358481 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:34Z","lastTransitionTime":"2025-10-01T11:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.461660 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.461710 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.461726 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.461748 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.461765 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:34Z","lastTransitionTime":"2025-10-01T11:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.565192 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.565245 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.565259 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.565278 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.565293 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:34Z","lastTransitionTime":"2025-10-01T11:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.643304 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.643304 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:34 crc kubenswrapper[4669]: E1001 11:29:34.643983 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:34 crc kubenswrapper[4669]: E1001 11:29:34.644209 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.669266 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.669331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.669350 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.669379 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.669399 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:34Z","lastTransitionTime":"2025-10-01T11:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.771782 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.771851 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.771873 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.771900 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.771919 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:34Z","lastTransitionTime":"2025-10-01T11:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.875071 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.875173 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.875191 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.875221 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.875241 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:34Z","lastTransitionTime":"2025-10-01T11:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.978655 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.978733 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.978752 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.978779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:34 crc kubenswrapper[4669]: I1001 11:29:34.978799 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:34Z","lastTransitionTime":"2025-10-01T11:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.082931 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.083749 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.084242 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.084435 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.084567 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:35Z","lastTransitionTime":"2025-10-01T11:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.188046 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.188152 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.188163 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.188180 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.188190 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:35Z","lastTransitionTime":"2025-10-01T11:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.291335 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.291573 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.291661 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.291750 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.291817 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:35Z","lastTransitionTime":"2025-10-01T11:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.395547 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.395587 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.395596 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.395610 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.395619 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:35Z","lastTransitionTime":"2025-10-01T11:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.499820 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.499853 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.499861 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.499881 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.499892 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:35Z","lastTransitionTime":"2025-10-01T11:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.601969 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.602015 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.602024 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.602040 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.602049 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:35Z","lastTransitionTime":"2025-10-01T11:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.643895 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.643896 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:35 crc kubenswrapper[4669]: E1001 11:29:35.644102 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:35 crc kubenswrapper[4669]: E1001 11:29:35.644141 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.704710 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.705045 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.705134 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.705201 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.705280 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:35Z","lastTransitionTime":"2025-10-01T11:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.808511 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.808874 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.809027 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.809178 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.809287 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:35Z","lastTransitionTime":"2025-10-01T11:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.913005 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.913061 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.913091 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.913111 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:35 crc kubenswrapper[4669]: I1001 11:29:35.913124 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:35Z","lastTransitionTime":"2025-10-01T11:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.015880 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.015944 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.015959 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.015980 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.015998 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.119816 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.119868 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.119878 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.119897 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.119910 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.223062 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.223663 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.223948 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.224447 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.224810 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.331418 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.333746 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.333960 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.334212 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.334395 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.440605 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.441003 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.441241 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.441644 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.441779 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.545290 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.545607 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.545715 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.545814 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.545890 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.638359 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:36 crc kubenswrapper[4669]: E1001 11:29:36.638523 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:36 crc kubenswrapper[4669]: E1001 11:29:36.638591 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs podName:30ba513f-67c5-4e4f-b8a7-be9c67660bec nodeName:}" failed. No retries permitted until 2025-10-01 11:30:08.638572928 +0000 UTC m=+99.738137905 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs") pod "network-metrics-daemon-wvnw6" (UID: "30ba513f-67c5-4e4f-b8a7-be9c67660bec") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.643915 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.643968 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:36 crc kubenswrapper[4669]: E1001 11:29:36.644052 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:36 crc kubenswrapper[4669]: E1001 11:29:36.644151 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.648984 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.649018 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.649030 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.649048 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.649065 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.735693 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.735746 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.735758 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.735775 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.735787 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: E1001 11:29:36.749743 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:36Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.753929 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.753992 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.754002 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.754026 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.754037 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: E1001 11:29:36.766472 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:36Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.770887 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.770953 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.770968 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.770990 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.771003 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: E1001 11:29:36.787746 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:36Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.792459 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.792541 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.792559 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.792594 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.792621 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: E1001 11:29:36.809494 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:36Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.814725 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.814782 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.814793 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.814812 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.814829 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: E1001 11:29:36.828139 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:36Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:36 crc kubenswrapper[4669]: E1001 11:29:36.828313 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.834033 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.834127 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.834144 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.834167 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.834193 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.938008 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.938097 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.938117 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.938142 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:36 crc kubenswrapper[4669]: I1001 11:29:36.938163 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:36Z","lastTransitionTime":"2025-10-01T11:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.040875 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.040938 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.040952 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.040972 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.040988 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:37Z","lastTransitionTime":"2025-10-01T11:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.143992 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.144044 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.144054 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.144074 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.144104 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:37Z","lastTransitionTime":"2025-10-01T11:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.245817 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.245852 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.245860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.245873 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.245882 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:37Z","lastTransitionTime":"2025-10-01T11:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.348686 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.348761 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.348774 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.348798 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.348816 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:37Z","lastTransitionTime":"2025-10-01T11:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.451897 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.451940 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.451951 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.451967 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.451976 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:37Z","lastTransitionTime":"2025-10-01T11:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.555798 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.555853 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.555862 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.555881 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.555897 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:37Z","lastTransitionTime":"2025-10-01T11:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.643848 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.643948 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:37 crc kubenswrapper[4669]: E1001 11:29:37.644010 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:37 crc kubenswrapper[4669]: E1001 11:29:37.644211 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.659343 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.659420 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.659432 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.659451 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.659464 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:37Z","lastTransitionTime":"2025-10-01T11:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.765851 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.765949 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.765958 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.766135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.766149 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:37Z","lastTransitionTime":"2025-10-01T11:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.869690 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.869769 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.869811 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.869844 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.869868 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:37Z","lastTransitionTime":"2025-10-01T11:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.972875 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.972930 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.972944 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.972962 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:37 crc kubenswrapper[4669]: I1001 11:29:37.972977 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:37Z","lastTransitionTime":"2025-10-01T11:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.076322 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.076369 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.076383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.076401 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.076417 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:38Z","lastTransitionTime":"2025-10-01T11:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.179744 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.179806 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.179820 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.179841 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.179856 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:38Z","lastTransitionTime":"2025-10-01T11:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.283149 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.283192 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.283208 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.283229 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.283245 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:38Z","lastTransitionTime":"2025-10-01T11:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.386517 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.386574 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.386591 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.386615 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.386632 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:38Z","lastTransitionTime":"2025-10-01T11:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.492980 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.493106 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.493129 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.493155 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.493174 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:38Z","lastTransitionTime":"2025-10-01T11:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.595585 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.595744 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.595776 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.595810 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.595834 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:38Z","lastTransitionTime":"2025-10-01T11:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.643179 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.643220 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:38 crc kubenswrapper[4669]: E1001 11:29:38.643377 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:38 crc kubenswrapper[4669]: E1001 11:29:38.643513 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.698799 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.698869 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.698880 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.698899 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.698915 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:38Z","lastTransitionTime":"2025-10-01T11:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.802164 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.802228 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.802251 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.802282 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.802302 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:38Z","lastTransitionTime":"2025-10-01T11:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.909859 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.909905 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.909913 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.909931 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:38 crc kubenswrapper[4669]: I1001 11:29:38.909942 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:38Z","lastTransitionTime":"2025-10-01T11:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.013229 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.013285 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.013297 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.013319 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.013332 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:39Z","lastTransitionTime":"2025-10-01T11:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.116683 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.116745 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.116764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.116785 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.116801 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:39Z","lastTransitionTime":"2025-10-01T11:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.222294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.222346 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.222362 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.222381 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.222393 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:39Z","lastTransitionTime":"2025-10-01T11:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.325207 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.325262 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.325276 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.325294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.325305 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:39Z","lastTransitionTime":"2025-10-01T11:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.429336 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.429818 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.429931 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.430057 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.430244 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:39Z","lastTransitionTime":"2025-10-01T11:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.533601 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.533680 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.533698 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.533726 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.533746 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:39Z","lastTransitionTime":"2025-10-01T11:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.637020 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.637162 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.637192 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.637268 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.637340 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:39Z","lastTransitionTime":"2025-10-01T11:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.643391 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.643527 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:39 crc kubenswrapper[4669]: E1001 11:29:39.643641 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:39 crc kubenswrapper[4669]: E1001 11:29:39.643826 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.663556 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:0
4Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.679826 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\
\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.697192 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.712006 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T1
1:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.732008 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8
fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.741039 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.741281 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.741379 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.741493 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.741580 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:39Z","lastTransitionTime":"2025-10-01T11:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.756813 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.771476 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.783467 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.794282 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.808716 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.830878 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h 
after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f72
19ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.846489 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.846537 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.846549 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.846568 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.846467 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.846580 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:39Z","lastTransitionTime":"2025-10-01T11:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.862566 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.878779 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.892788 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.905738 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc 
kubenswrapper[4669]: I1001 11:29:39.919912 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6b93a6-fa53-42b0-8563-6ea4123a0cd7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb4b78de0070f57401ab73d5f6e47d62a5752c1db89a5a25d2ba05592c6878a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15bbac1ae30396ae6308715bed291356384e32beecf27a0464939b01a6c6d44c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db32ff64b8dd7190a15a38b7c3de701a2381b513d437e4577e3ddb54fa614c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.937058 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af
67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:39Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.949018 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.949071 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.949108 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.949128 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:39 crc kubenswrapper[4669]: I1001 11:29:39.949139 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:39Z","lastTransitionTime":"2025-10-01T11:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.052121 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.052496 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.052593 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.052703 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.052789 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:40Z","lastTransitionTime":"2025-10-01T11:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.156312 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.156600 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.156733 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.156835 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.156935 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:40Z","lastTransitionTime":"2025-10-01T11:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.228223 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9kgdm_238b8e33-ca8b-419a-b038-329ab97a3843/kube-multus/0.log" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.228271 4669 generic.go:334] "Generic (PLEG): container finished" podID="238b8e33-ca8b-419a-b038-329ab97a3843" containerID="7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471" exitCode=1 Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.228302 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9kgdm" event={"ID":"238b8e33-ca8b-419a-b038-329ab97a3843","Type":"ContainerDied","Data":"7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471"} Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.228667 4669 scope.go:117] "RemoveContainer" containerID="7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.248373 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.262512 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.262592 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.262615 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.262649 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.262673 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:40Z","lastTransitionTime":"2025-10-01T11:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.266464 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.282144 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552
dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.296677 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.314663 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.331559 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:39Z\\\",\\\"message\\\":\\\"2025-10-01T11:28:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d\\\\n2025-10-01T11:28:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d to /host/opt/cni/bin/\\\\n2025-10-01T11:28:53Z [verbose] multus-daemon started\\\\n2025-10-01T11:28:53Z [verbose] Readiness Indicator file check\\\\n2025-10-01T11:29:38Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.356736 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h 
after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f72
19ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.365639 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.365668 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.365676 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.365693 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.365704 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:40Z","lastTransitionTime":"2025-10-01T11:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.371119 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.390783 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.407568 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.421707 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.433613 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.452733 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6b93a6-fa53-42b0-8563-6ea4123a0cd7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb4b78de0070f57401ab73d5f6e47d62a5752c1db89a5a25d2ba05592c6878a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15bbac1ae30396ae6308715bed291356384e32beecf27a0464939b01a6c6d44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db32ff64b8dd7190a15a38b7c3de701a2381b513d437e4577e3ddb54fa614c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.469400 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.469460 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.469477 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.469504 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.469523 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:40Z","lastTransitionTime":"2025-10-01T11:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.473528 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c
37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.490750 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.505494 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.518177 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.532056 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:40Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:40 crc 
kubenswrapper[4669]: I1001 11:29:40.572902 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.572951 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.572977 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.572995 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.573009 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:40Z","lastTransitionTime":"2025-10-01T11:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.643316 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.643355 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:40 crc kubenswrapper[4669]: E1001 11:29:40.643511 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:40 crc kubenswrapper[4669]: E1001 11:29:40.643695 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.675132 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.675178 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.675189 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.675206 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.675225 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:40Z","lastTransitionTime":"2025-10-01T11:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.777758 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.777793 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.777801 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.777817 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.777828 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:40Z","lastTransitionTime":"2025-10-01T11:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.881459 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.881515 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.881527 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.881544 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.881556 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:40Z","lastTransitionTime":"2025-10-01T11:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.984199 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.984258 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.984271 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.984293 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:40 crc kubenswrapper[4669]: I1001 11:29:40.984310 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:40Z","lastTransitionTime":"2025-10-01T11:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.087778 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.087859 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.087876 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.087906 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.087923 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:41Z","lastTransitionTime":"2025-10-01T11:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.191969 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.192020 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.192030 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.192050 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.192060 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:41Z","lastTransitionTime":"2025-10-01T11:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.235447 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9kgdm_238b8e33-ca8b-419a-b038-329ab97a3843/kube-multus/0.log" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.235536 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9kgdm" event={"ID":"238b8e33-ca8b-419a-b038-329ab97a3843","Type":"ContainerStarted","Data":"7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795"} Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.251952 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.267631 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8
fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.283024 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-01T11:29:39Z\\\",\\\"message\\\":\\\"2025-10-01T11:28:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d\\\\n2025-10-01T11:28:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d to /host/opt/cni/bin/\\\\n2025-10-01T11:28:53Z [verbose] multus-daemon started\\\\n2025-10-01T11:28:53Z [verbose] Readiness Indicator file check\\\\n2025-10-01T11:29:38Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.294479 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.294539 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.294551 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.294574 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.294588 4669 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:41Z","lastTransitionTime":"2025-10-01T11:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.302998 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h 
after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f72
19ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.317156 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.339514 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.354604 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.367539 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.383560 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.395764 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6b93a6-fa53-42b0-8563-6ea4123a0cd7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb4b78de0070f57401ab73d5f6e47d62a5752c1db89a5a25d2ba05592c6878a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15bbac1ae30396ae6308715bed291356384e32beecf27a0464939b01a6c6d44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db32ff64b8dd7190a15a38b7c3de701a2381b513d437e4577e3ddb54fa614c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.398166 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.398239 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.398257 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.398284 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.398303 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:41Z","lastTransitionTime":"2025-10-01T11:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.415703 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c
37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.431443 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.449292 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.472742 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.483853 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc 
kubenswrapper[4669]: I1001 11:29:41.498813 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.501025 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.501096 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.501108 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.501127 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 
11:29:41.501140 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:41Z","lastTransitionTime":"2025-10-01T11:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.514621 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63
f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.536096 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552
dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:41Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.604552 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.604641 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.604666 4669 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.604699 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.604724 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:41Z","lastTransitionTime":"2025-10-01T11:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.643368 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.643521 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:41 crc kubenswrapper[4669]: E1001 11:29:41.643729 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:41 crc kubenswrapper[4669]: E1001 11:29:41.644008 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.707770 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.707812 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.707825 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.707841 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.707853 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:41Z","lastTransitionTime":"2025-10-01T11:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.810959 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.811013 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.811024 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.811046 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.811060 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:41Z","lastTransitionTime":"2025-10-01T11:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.914405 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.914456 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.914466 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.914485 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:41 crc kubenswrapper[4669]: I1001 11:29:41.914497 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:41Z","lastTransitionTime":"2025-10-01T11:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.017626 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.017704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.017731 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.017764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.017788 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:42Z","lastTransitionTime":"2025-10-01T11:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.120783 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.120833 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.120843 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.120860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.120871 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:42Z","lastTransitionTime":"2025-10-01T11:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.224751 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.224802 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.224816 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.224833 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.224845 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:42Z","lastTransitionTime":"2025-10-01T11:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.328731 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.328800 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.328813 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.328835 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.328852 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:42Z","lastTransitionTime":"2025-10-01T11:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.431855 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.431906 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.431918 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.431954 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.431972 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:42Z","lastTransitionTime":"2025-10-01T11:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.534621 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.534670 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.534684 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.534706 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.534720 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:42Z","lastTransitionTime":"2025-10-01T11:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.637862 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.637934 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.637951 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.637975 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.637994 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:42Z","lastTransitionTime":"2025-10-01T11:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.643391 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.643547 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:42 crc kubenswrapper[4669]: E1001 11:29:42.643667 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:42 crc kubenswrapper[4669]: E1001 11:29:42.643851 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.740629 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.740774 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.740784 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.740800 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.740809 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:42Z","lastTransitionTime":"2025-10-01T11:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.843865 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.844510 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.844525 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.844540 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.844551 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:42Z","lastTransitionTime":"2025-10-01T11:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.947053 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.947163 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.947186 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.947217 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:42 crc kubenswrapper[4669]: I1001 11:29:42.947240 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:42Z","lastTransitionTime":"2025-10-01T11:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.050207 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.050263 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.050277 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.050304 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.050318 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:43Z","lastTransitionTime":"2025-10-01T11:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.153239 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.153290 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.153301 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.153320 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.153330 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:43Z","lastTransitionTime":"2025-10-01T11:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.256815 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.257121 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.257143 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.257169 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.257185 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:43Z","lastTransitionTime":"2025-10-01T11:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.360547 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.360620 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.360639 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.360667 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.360692 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:43Z","lastTransitionTime":"2025-10-01T11:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.464297 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.464341 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.464353 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.464370 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.464381 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:43Z","lastTransitionTime":"2025-10-01T11:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.567807 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.567878 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.567897 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.567926 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.567948 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:43Z","lastTransitionTime":"2025-10-01T11:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.643980 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.644015 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:43 crc kubenswrapper[4669]: E1001 11:29:43.644229 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:43 crc kubenswrapper[4669]: E1001 11:29:43.644371 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.676127 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.676214 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.676236 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.676279 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.676300 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:43Z","lastTransitionTime":"2025-10-01T11:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.779364 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.779437 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.779456 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.779483 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.779502 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:43Z","lastTransitionTime":"2025-10-01T11:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.882529 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.882593 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.882605 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.882627 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.882643 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:43Z","lastTransitionTime":"2025-10-01T11:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.985964 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.986016 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.986027 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.986048 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:43 crc kubenswrapper[4669]: I1001 11:29:43.986061 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:43Z","lastTransitionTime":"2025-10-01T11:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.090049 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.090150 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.090168 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.090195 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.090214 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:44Z","lastTransitionTime":"2025-10-01T11:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.193926 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.194010 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.194028 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.194116 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.194136 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:44Z","lastTransitionTime":"2025-10-01T11:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.296792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.296847 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.296861 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.296884 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.296897 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:44Z","lastTransitionTime":"2025-10-01T11:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.400275 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.400309 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.400318 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.400333 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.400343 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:44Z","lastTransitionTime":"2025-10-01T11:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.502970 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.503017 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.503027 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.503049 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.503063 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:44Z","lastTransitionTime":"2025-10-01T11:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.606476 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.606555 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.606579 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.606609 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.606634 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:44Z","lastTransitionTime":"2025-10-01T11:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.644038 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.644119 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:44 crc kubenswrapper[4669]: E1001 11:29:44.644281 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:44 crc kubenswrapper[4669]: E1001 11:29:44.644750 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.645254 4669 scope.go:117] "RemoveContainer" containerID="1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.710171 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.710250 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.710262 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.710279 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.710289 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:44Z","lastTransitionTime":"2025-10-01T11:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.813197 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.813283 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.813322 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.813359 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.813386 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:44Z","lastTransitionTime":"2025-10-01T11:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.916492 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.916538 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.916551 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.916572 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:44 crc kubenswrapper[4669]: I1001 11:29:44.916587 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:44Z","lastTransitionTime":"2025-10-01T11:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.018921 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.018971 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.018981 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.019001 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.019012 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:45Z","lastTransitionTime":"2025-10-01T11:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.121770 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.121834 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.121852 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.121876 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.121895 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:45Z","lastTransitionTime":"2025-10-01T11:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.225704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.225770 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.225794 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.225823 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.225850 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:45Z","lastTransitionTime":"2025-10-01T11:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.329049 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.329266 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.329285 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.329315 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.329334 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:45Z","lastTransitionTime":"2025-10-01T11:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.431720 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.431824 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.431846 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.431872 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.431894 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:45Z","lastTransitionTime":"2025-10-01T11:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.535710 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.535851 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.535915 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.535971 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.535994 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:45Z","lastTransitionTime":"2025-10-01T11:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.639381 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.639445 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.639472 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.639501 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.639524 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:45Z","lastTransitionTime":"2025-10-01T11:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.644198 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.644234 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:45 crc kubenswrapper[4669]: E1001 11:29:45.644411 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:45 crc kubenswrapper[4669]: E1001 11:29:45.644572 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.743860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.743914 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.743932 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.744162 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.744180 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:45Z","lastTransitionTime":"2025-10-01T11:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.847279 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.847339 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.847356 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.847381 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.847400 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:45Z","lastTransitionTime":"2025-10-01T11:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.951530 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.951590 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.951614 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.951639 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:45 crc kubenswrapper[4669]: I1001 11:29:45.951658 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:45Z","lastTransitionTime":"2025-10-01T11:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.054123 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.054159 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.054168 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.054184 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.054193 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:46Z","lastTransitionTime":"2025-10-01T11:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.157756 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.157816 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.157834 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.157857 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.157875 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:46Z","lastTransitionTime":"2025-10-01T11:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.259027 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/2.log" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.260514 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.260614 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.260634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.260662 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.260681 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:46Z","lastTransitionTime":"2025-10-01T11:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.263463 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"} Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.264128 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.285525 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-
01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.310920 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8
fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.324847 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.339054 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d
4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.358198 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:39Z\\\",\\\"message\\\":\\\"2025-10-01T11:28:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d\\\\n2025-10-01T11:28:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d to /host/opt/cni/bin/\\\\n2025-10-01T11:28:53Z [verbose] multus-daemon started\\\\n2025-10-01T11:28:53Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T11:29:38Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.363464 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.363524 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.363542 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.363571 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.363590 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:46Z","lastTransitionTime":"2025-10-01T11:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.384447 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h 
after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.404831 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.435772 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.453739 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.466661 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.466738 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.466763 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 
11:29:46.466801 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.466829 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:46Z","lastTransitionTime":"2025-10-01T11:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.474261 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.493548 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc 
kubenswrapper[4669]: I1001 11:29:46.513272 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6b93a6-fa53-42b0-8563-6ea4123a0cd7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb4b78de0070f57401ab73d5f6e47d62a5752c1db89a5a25d2ba05592c6878a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15bbac1ae30396ae6308715bed291356384e32beecf27a0464939b01a6c6d44c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db32ff64b8dd7190a15a38b7c3de701a2381b513d437e4577e3ddb54fa614c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.537261 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af
67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.557492 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.570363 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.570420 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.570433 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.570454 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.570469 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:46Z","lastTransitionTime":"2025-10-01T11:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.581717 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.603161 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.621160 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.640114 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:46Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.643475 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.643560 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:46 crc kubenswrapper[4669]: E1001 11:29:46.643633 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:46 crc kubenswrapper[4669]: E1001 11:29:46.643798 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.674259 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.674319 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.674329 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.674347 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.674360 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:46Z","lastTransitionTime":"2025-10-01T11:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.777576 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.777646 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.777668 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.777699 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.777725 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:46Z","lastTransitionTime":"2025-10-01T11:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.881673 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.881737 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.881754 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.881779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.881797 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:46Z","lastTransitionTime":"2025-10-01T11:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.985484 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.985581 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.985605 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.985637 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:46 crc kubenswrapper[4669]: I1001 11:29:46.985660 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:46Z","lastTransitionTime":"2025-10-01T11:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.022724 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.022801 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.022820 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.022850 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.022868 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: E1001 11:29:47.046245 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:47Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.051696 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.051744 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.051762 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.051784 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.051803 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: E1001 11:29:47.073194 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:47Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.080347 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.080391 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.080409 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.080433 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.080452 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: E1001 11:29:47.105397 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:47Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.111463 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.111518 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.111535 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.111561 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.111579 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: E1001 11:29:47.135782 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:47Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.141420 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.141473 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.141489 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.141509 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.141527 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: E1001 11:29:47.161624 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:47Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:47 crc kubenswrapper[4669]: E1001 11:29:47.161854 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.164323 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.164397 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.164422 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.164448 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.164469 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.278179 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.278804 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.278838 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.278870 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.278895 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.381602 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.381686 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.381715 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.381750 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.381771 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.486324 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.486377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.486391 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.486410 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.486424 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.589656 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.589696 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.589707 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.589728 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.589742 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.646260 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:47 crc kubenswrapper[4669]: E1001 11:29:47.646403 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.646629 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:47 crc kubenswrapper[4669]: E1001 11:29:47.646839 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.692633 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.692684 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.692693 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.692710 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.692722 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.795221 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.795267 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.795279 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.795298 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.795314 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.898734 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.898978 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.899046 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.899160 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:47 crc kubenswrapper[4669]: I1001 11:29:47.899245 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:47Z","lastTransitionTime":"2025-10-01T11:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.002350 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.002439 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.002457 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.002484 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.002503 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:48Z","lastTransitionTime":"2025-10-01T11:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.106851 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.106927 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.106948 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.106983 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.107008 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:48Z","lastTransitionTime":"2025-10-01T11:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.209858 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.209943 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.209973 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.210007 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.210034 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:48Z","lastTransitionTime":"2025-10-01T11:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.314201 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.314602 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.314673 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.314742 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.314929 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:48Z","lastTransitionTime":"2025-10-01T11:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.418607 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.418675 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.418693 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.418722 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.418744 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:48Z","lastTransitionTime":"2025-10-01T11:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.522336 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.522739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.522828 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.522914 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.522990 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:48Z","lastTransitionTime":"2025-10-01T11:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.626368 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.626418 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.626432 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.626453 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.626467 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:48Z","lastTransitionTime":"2025-10-01T11:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.643991 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.644228 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:48 crc kubenswrapper[4669]: E1001 11:29:48.644391 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:48 crc kubenswrapper[4669]: E1001 11:29:48.644659 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.729704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.729743 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.729754 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.729771 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.729782 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:48Z","lastTransitionTime":"2025-10-01T11:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.831993 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.832129 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.832165 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.832195 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.832215 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:48Z","lastTransitionTime":"2025-10-01T11:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.934560 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.934616 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.934634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.934657 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:48 crc kubenswrapper[4669]: I1001 11:29:48.934673 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:48Z","lastTransitionTime":"2025-10-01T11:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.038340 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.038396 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.038406 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.038426 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.038440 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:49Z","lastTransitionTime":"2025-10-01T11:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.140888 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.140941 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.140953 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.140973 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.140987 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:49Z","lastTransitionTime":"2025-10-01T11:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.244347 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.244392 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.244405 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.244425 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.244436 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:49Z","lastTransitionTime":"2025-10-01T11:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.286310 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/3.log" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.286887 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/2.log" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.289643 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0" exitCode=1 Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.289709 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"} Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.289752 4669 scope.go:117] "RemoveContainer" containerID="1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.290726 4669 scope.go:117] "RemoveContainer" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0" Oct 01 11:29:49 crc kubenswrapper[4669]: E1001 11:29:49.290938 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.309406 4669 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.327536 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e5431
9f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.347952 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.348002 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.348012 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.348035 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.348046 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:49Z","lastTransitionTime":"2025-10-01T11:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.351059 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h 
after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:48Z\\\",\\\"message\\\":\\\"al\\\\nI1001 11:29:48.091051 6709 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 11:29:48.091059 6709 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 11:29:48.091052 6709 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1001 11:29:48.091117 6709 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 11:29:48.091209 6709 
factory.go:656] Stopping watch factory\\\\nI1001 11:29:48.091473 6709 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:48.091903 6709 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 11:29:48.091926 6709 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 11:29:48.091936 6709 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 11:29:48.091949 6709 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 11:29:48.092002 6709 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:48.092027 6709 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.368278 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.388144 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.410781 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.424334 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.438441 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.450813 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.450862 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.450874 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.450894 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.451221 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:49Z","lastTransitionTime":"2025-10-01T11:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.453855 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:39Z\\\",\\\"message\\\":\\\"2025-10-01T11:28:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d\\\\n2025-10-01T11:28:53+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d to /host/opt/cni/bin/\\\\n2025-10-01T11:28:53Z [verbose] multus-daemon started\\\\n2025-10-01T11:28:53Z [verbose] Readiness Indicator file check\\\\n2025-10-01T11:29:38Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.465558 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6b93a6-fa53-42b0-8563-6ea4123a0cd7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb4b78de0070f57401ab73d5f6e47d62a5752c1db89a5a25d2ba05592c6878a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15bbac1ae30396ae6308715bed291356384e32beecf27a0464939b01a6c6d44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db32ff64b8dd7190a15a38b7c3de701a2381b513d437e4577e3ddb54fa614c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.479567 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af
67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.492789 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.505788 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.518961 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.531049 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc 
kubenswrapper[4669]: I1001 11:29:49.543181 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.554325 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.554400 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.554422 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.554770 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 
11:29:49.554978 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:49Z","lastTransitionTime":"2025-10-01T11:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.555205 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63
f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.566888 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552
dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.643590 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:49 crc kubenswrapper[4669]: E1001 11:29:49.643788 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.644136 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:49 crc kubenswrapper[4669]: E1001 11:29:49.644249 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.657753 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.657974 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.658172 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.658219 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.658244 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:49Z","lastTransitionTime":"2025-10-01T11:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.660891 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.678219 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.691626 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.707181 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.724920 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13
fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k99
7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79
fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.739755 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:39Z\\\",\\\"message\\\":\\\"2025-10-01T11:28:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d\\\\n2025-10-01T11:28:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d to /host/opt/cni/bin/\\\\n2025-10-01T11:28:53Z [verbose] multus-daemon started\\\\n2025-10-01T11:28:53Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T11:29:38Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.760461 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.760529 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.760541 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.760560 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.760593 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:49Z","lastTransitionTime":"2025-10-01T11:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.766194 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h 
after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:48Z\\\",\\\"message\\\":\\\"al\\\\nI1001 11:29:48.091051 6709 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 11:29:48.091059 6709 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 11:29:48.091052 6709 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1001 11:29:48.091117 6709 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 11:29:48.091209 6709 
factory.go:656] Stopping watch factory\\\\nI1001 11:29:48.091473 6709 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:48.091903 6709 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 11:29:48.091926 6709 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 11:29:48.091936 6709 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 11:29:48.091949 6709 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 11:29:48.092002 6709 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:48.092027 6709 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.781251 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.805941 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.820448 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.831261 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.846819 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.860428 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6b93a6-fa53-42b0-8563-6ea4123a0cd7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb4b78de0070f57401ab73d5f6e47d62a5752c1db89a5a25d2ba05592c6878a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15bbac1ae30396ae6308715bed291356384e32beecf27a0464939b01a6c6d44c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db32ff64b8dd7190a15a38b7c3de701a2381b513d437e4577e3ddb54fa614c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.864792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.864850 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.864864 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.864884 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.864896 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:49Z","lastTransitionTime":"2025-10-01T11:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.875939 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c
37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.889267 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.900973 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.914224 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.926987 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:49Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:49 crc 
kubenswrapper[4669]: I1001 11:29:49.967860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.967925 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.967937 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.967958 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:49 crc kubenswrapper[4669]: I1001 11:29:49.967971 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:49Z","lastTransitionTime":"2025-10-01T11:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.071531 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.071641 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.071653 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.071669 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.071679 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:50Z","lastTransitionTime":"2025-10-01T11:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.174780 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.174835 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.174849 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.174866 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.174878 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:50Z","lastTransitionTime":"2025-10-01T11:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.277964 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.278037 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.278048 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.278068 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.278099 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:50Z","lastTransitionTime":"2025-10-01T11:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.295623 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/3.log" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.381344 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.381434 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.381444 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.381463 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.381497 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:50Z","lastTransitionTime":"2025-10-01T11:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.484558 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.484635 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.484656 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.484684 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.484704 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:50Z","lastTransitionTime":"2025-10-01T11:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.588424 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.588484 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.588499 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.588525 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.588539 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:50Z","lastTransitionTime":"2025-10-01T11:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.643567 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.643571 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:50 crc kubenswrapper[4669]: E1001 11:29:50.643742 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:50 crc kubenswrapper[4669]: E1001 11:29:50.643850 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.691373 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.691436 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.691449 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.691473 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.691487 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:50Z","lastTransitionTime":"2025-10-01T11:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.794469 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.794982 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.794999 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.795020 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.795035 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:50Z","lastTransitionTime":"2025-10-01T11:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.897991 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.898043 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.898052 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.898090 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:50 crc kubenswrapper[4669]: I1001 11:29:50.898101 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:50Z","lastTransitionTime":"2025-10-01T11:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.001244 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.001327 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.001363 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.001398 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.001422 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:51Z","lastTransitionTime":"2025-10-01T11:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.104571 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.104612 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.104624 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.104641 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.104652 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:51Z","lastTransitionTime":"2025-10-01T11:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.207601 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.207714 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.207732 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.207760 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.207780 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:51Z","lastTransitionTime":"2025-10-01T11:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.310265 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.310347 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.310369 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.310400 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.310420 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:51Z","lastTransitionTime":"2025-10-01T11:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.413823 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.413885 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.413903 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.413933 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.413955 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:51Z","lastTransitionTime":"2025-10-01T11:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.517990 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.518039 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.518054 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.518092 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.518105 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:51Z","lastTransitionTime":"2025-10-01T11:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.626143 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.626856 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.626891 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.626927 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.626950 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:51Z","lastTransitionTime":"2025-10-01T11:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.643381 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:51 crc kubenswrapper[4669]: E1001 11:29:51.643603 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.644177 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:51 crc kubenswrapper[4669]: E1001 11:29:51.644392 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.730811 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.730849 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.730858 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.730872 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.730882 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:51Z","lastTransitionTime":"2025-10-01T11:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.834383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.834424 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.834434 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.834452 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.834465 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:51Z","lastTransitionTime":"2025-10-01T11:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.937456 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.937538 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.937556 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.937588 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:51 crc kubenswrapper[4669]: I1001 11:29:51.937607 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:51Z","lastTransitionTime":"2025-10-01T11:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.041243 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.041312 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.041334 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.041363 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.041381 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:52Z","lastTransitionTime":"2025-10-01T11:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.144471 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.144516 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.144529 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.144560 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.144570 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:52Z","lastTransitionTime":"2025-10-01T11:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.247221 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.247295 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.247306 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.247329 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.247339 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:52Z","lastTransitionTime":"2025-10-01T11:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.349972 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.350053 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.350134 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.350168 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.350191 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:52Z","lastTransitionTime":"2025-10-01T11:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.453808 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.453874 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.453891 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.453917 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.453935 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:52Z","lastTransitionTime":"2025-10-01T11:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.557190 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.557318 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.557363 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.557403 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.557424 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:52Z","lastTransitionTime":"2025-10-01T11:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.734008 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:52 crc kubenswrapper[4669]: E1001 11:29:52.734505 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.734748 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:52 crc kubenswrapper[4669]: E1001 11:29:52.734877 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.734757 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:52 crc kubenswrapper[4669]: E1001 11:29:52.735066 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.737064 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.737139 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.737153 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.737177 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.737189 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:52Z","lastTransitionTime":"2025-10-01T11:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.840788 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.840835 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.840845 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.840863 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.840872 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:52Z","lastTransitionTime":"2025-10-01T11:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.944623 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.944661 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.944670 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.944696 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:52 crc kubenswrapper[4669]: I1001 11:29:52.944709 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:52Z","lastTransitionTime":"2025-10-01T11:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.047202 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.047261 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.047274 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.047294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.047309 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:53Z","lastTransitionTime":"2025-10-01T11:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.151553 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.151635 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.151660 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.151693 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.151715 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:53Z","lastTransitionTime":"2025-10-01T11:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.254236 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.254273 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.254283 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.254299 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.254308 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:53Z","lastTransitionTime":"2025-10-01T11:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.357302 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.357353 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.357365 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.357384 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.357396 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:53Z","lastTransitionTime":"2025-10-01T11:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.460062 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.460127 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.460136 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.460154 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.460168 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:53Z","lastTransitionTime":"2025-10-01T11:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.562583 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.562652 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.562677 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.562711 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.562734 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:53Z","lastTransitionTime":"2025-10-01T11:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.643780 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.643991 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.648970 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.649047 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649152 4669 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649163 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.649130156 +0000 UTC m=+148.748695143 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.649262 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649325 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.649306781 +0000 UTC m=+148.748871968 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.649357 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.649398 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649459 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649481 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649500 4669 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649505 4669 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649559 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.649546457 +0000 UTC m=+148.749111454 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649581 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649602 4669 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649616 4669 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:29:53 crc 
kubenswrapper[4669]: E1001 11:29:53.649584 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.649573378 +0000 UTC m=+148.749138605 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 11:29:53 crc kubenswrapper[4669]: E1001 11:29:53.649677 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.64965749 +0000 UTC m=+148.749222477 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.664769 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.664805 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.664820 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.664835 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.664845 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:53Z","lastTransitionTime":"2025-10-01T11:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.767392 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.767447 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.767462 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.767485 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.767501 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:53Z","lastTransitionTime":"2025-10-01T11:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.869967 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.870009 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.870020 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.870036 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.870048 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:53Z","lastTransitionTime":"2025-10-01T11:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.973584 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.973678 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.973697 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.973730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:53 crc kubenswrapper[4669]: I1001 11:29:53.973751 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:53Z","lastTransitionTime":"2025-10-01T11:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.076752 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.076827 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.076844 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.076873 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.076891 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:54Z","lastTransitionTime":"2025-10-01T11:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.179863 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.179916 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.179926 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.179943 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.179954 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:54Z","lastTransitionTime":"2025-10-01T11:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.283851 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.283928 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.283947 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.283981 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.284004 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:54Z","lastTransitionTime":"2025-10-01T11:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.388124 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.388176 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.388189 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.388207 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.388221 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:54Z","lastTransitionTime":"2025-10-01T11:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.492196 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.492244 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.492253 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.492272 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.492284 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:54Z","lastTransitionTime":"2025-10-01T11:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.595337 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.595951 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.596415 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.596620 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.596795 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:54Z","lastTransitionTime":"2025-10-01T11:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.644121 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:54 crc kubenswrapper[4669]: E1001 11:29:54.644273 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.644739 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:54 crc kubenswrapper[4669]: E1001 11:29:54.645181 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.644941 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:54 crc kubenswrapper[4669]: E1001 11:29:54.645839 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.700667 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.701157 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.701339 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.701476 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.701616 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:54Z","lastTransitionTime":"2025-10-01T11:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.805030 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.805145 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.805159 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.805181 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.805198 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:54Z","lastTransitionTime":"2025-10-01T11:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.909025 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.909105 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.909118 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.909137 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:54 crc kubenswrapper[4669]: I1001 11:29:54.909149 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:54Z","lastTransitionTime":"2025-10-01T11:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.012274 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.012666 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.012731 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.012807 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.012874 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:55Z","lastTransitionTime":"2025-10-01T11:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.116596 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.116716 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.116739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.116772 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.116796 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:55Z","lastTransitionTime":"2025-10-01T11:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.221850 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.223176 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.223329 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.223483 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.223618 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:55Z","lastTransitionTime":"2025-10-01T11:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.326784 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.326870 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.326896 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.326928 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.326954 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:55Z","lastTransitionTime":"2025-10-01T11:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.430450 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.430517 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.430531 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.430553 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.430565 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:55Z","lastTransitionTime":"2025-10-01T11:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.534491 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.534560 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.534583 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.534620 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.534640 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:55Z","lastTransitionTime":"2025-10-01T11:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.638398 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.638454 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.638468 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.638507 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.638520 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:55Z","lastTransitionTime":"2025-10-01T11:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.644013 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:55 crc kubenswrapper[4669]: E1001 11:29:55.644156 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.741577 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.741641 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.741651 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.741672 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.741687 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:55Z","lastTransitionTime":"2025-10-01T11:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.845529 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.845598 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.845613 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.845635 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.845659 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:55Z","lastTransitionTime":"2025-10-01T11:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.948659 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.948720 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.948737 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.948763 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:55 crc kubenswrapper[4669]: I1001 11:29:55.948782 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:55Z","lastTransitionTime":"2025-10-01T11:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.052851 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.052942 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.052957 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.052976 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.052994 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:56Z","lastTransitionTime":"2025-10-01T11:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.155693 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.155765 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.155789 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.155820 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.155844 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:56Z","lastTransitionTime":"2025-10-01T11:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.259559 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.259619 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.259638 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.259662 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.259686 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:56Z","lastTransitionTime":"2025-10-01T11:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.363304 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.363388 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.363414 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.363444 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.363466 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:56Z","lastTransitionTime":"2025-10-01T11:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.467244 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.467389 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.467411 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.467442 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.467464 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:56Z","lastTransitionTime":"2025-10-01T11:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.570456 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.570511 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.570524 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.570548 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.570561 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:56Z","lastTransitionTime":"2025-10-01T11:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.644014 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.644123 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:56 crc kubenswrapper[4669]: E1001 11:29:56.644243 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.644349 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:56 crc kubenswrapper[4669]: E1001 11:29:56.644516 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:56 crc kubenswrapper[4669]: E1001 11:29:56.644943 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.673633 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.673704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.673731 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.673763 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.673795 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:56Z","lastTransitionTime":"2025-10-01T11:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.778121 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.778188 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.778202 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.778224 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.778238 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:56Z","lastTransitionTime":"2025-10-01T11:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.881510 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.881581 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.881605 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.881631 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.881649 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:56Z","lastTransitionTime":"2025-10-01T11:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.985604 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.985673 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.985694 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.985721 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:56 crc kubenswrapper[4669]: I1001 11:29:56.985746 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:56Z","lastTransitionTime":"2025-10-01T11:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.089382 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.089450 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.089463 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.089484 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.089498 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.182042 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.182169 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.182194 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.182231 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.182255 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: E1001 11:29:57.217947 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.226581 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.226634 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.226655 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.226685 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.226707 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: E1001 11:29:57.260652 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.269570 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.269632 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.269649 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.269671 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.269691 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: E1001 11:29:57.284051 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.288176 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.288238 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.288247 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.288263 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.288276 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: E1001 11:29:57.300153 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.303035 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.303068 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.303096 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.303112 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.303123 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: E1001 11:29:57.314154 4669 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dc18bf25-42dc-48ae-8a75-ee49cd1f6a6a\\\",\\\"systemUUID\\\":\\\"117c455f-c374-48da-bb29-55b6929cd967\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:57Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:57 crc kubenswrapper[4669]: E1001 11:29:57.314272 4669 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.315577 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.315604 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.315614 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.315631 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.315642 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.419862 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.419933 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.419950 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.419977 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.419996 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.522959 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.523023 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.523046 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.523109 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.523135 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.626681 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.626747 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.626764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.626792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.626811 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.643390 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:57 crc kubenswrapper[4669]: E1001 11:29:57.643565 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.730035 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.730100 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.730115 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.730137 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.730151 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.833496 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.833564 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.833581 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.833606 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.833625 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.936754 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.936826 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.936848 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.936879 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:57 crc kubenswrapper[4669]: I1001 11:29:57.936907 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:57Z","lastTransitionTime":"2025-10-01T11:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.040486 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.040534 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.040551 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.040573 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.040592 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:58Z","lastTransitionTime":"2025-10-01T11:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.143455 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.143522 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.143545 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.143573 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.143595 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:58Z","lastTransitionTime":"2025-10-01T11:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.246948 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.247038 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.247119 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.247150 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.247169 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:58Z","lastTransitionTime":"2025-10-01T11:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.350669 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.350735 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.350746 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.350764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.350778 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:58Z","lastTransitionTime":"2025-10-01T11:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.453463 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.453523 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.453540 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.453567 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.453590 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:58Z","lastTransitionTime":"2025-10-01T11:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.556384 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.556430 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.556442 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.556458 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.556471 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:58Z","lastTransitionTime":"2025-10-01T11:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.643186 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.643267 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.643231 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:29:58 crc kubenswrapper[4669]: E1001 11:29:58.643419 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:29:58 crc kubenswrapper[4669]: E1001 11:29:58.643623 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:29:58 crc kubenswrapper[4669]: E1001 11:29:58.643767 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.659287 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.659342 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.659359 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.659382 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.659401 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:58Z","lastTransitionTime":"2025-10-01T11:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.763626 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.763693 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.763712 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.763740 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.763760 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:58Z","lastTransitionTime":"2025-10-01T11:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.866296 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.866369 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.866386 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.866410 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.866431 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:58Z","lastTransitionTime":"2025-10-01T11:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.969192 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.969265 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.969283 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.969308 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:58 crc kubenswrapper[4669]: I1001 11:29:58.969326 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:58Z","lastTransitionTime":"2025-10-01T11:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.072860 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.072936 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.072953 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.072984 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.073003 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:59Z","lastTransitionTime":"2025-10-01T11:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.175689 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.175730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.175744 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.175760 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.175772 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:59Z","lastTransitionTime":"2025-10-01T11:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.279533 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.279592 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.279606 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.279623 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.279636 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:59Z","lastTransitionTime":"2025-10-01T11:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.382377 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.382468 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.382498 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.382530 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.382554 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:59Z","lastTransitionTime":"2025-10-01T11:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.485505 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.485582 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.485601 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.485639 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.485660 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:59Z","lastTransitionTime":"2025-10-01T11:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.589576 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.589704 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.589734 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.589768 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.589792 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:59Z","lastTransitionTime":"2025-10-01T11:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.643901 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:29:59 crc kubenswrapper[4669]: E1001 11:29:59.644207 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.665308 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2033fd0ddd7a62a1222cda42a935457e2b29163efe4e559b6b8b7c3bb17ad8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://2c3268f951dcab3a7f60548bfa9bf52319fb672dda34278ca73e457e49210281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.685545 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bf8lj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f26e14fc-b10f-49ae-9639-6974d58e88ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f629eed969d3f93adecf28ca165a42d4c1c297b091f0cc08dcd422ba41733da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfkls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bf8lj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.695170 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.695246 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.695267 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.695298 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.695321 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:59Z","lastTransitionTime":"2025-10-01T11:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.708242 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6bac435-6175-448d-a057-faaa4fa8114b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12899ab62b95407c31be410ec6e05e9961cc10f1ef93ec28cde0b1c8d334b4be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4baebc06da2fa10fa84fde7a65ec3152f7552dd9310fe8d84f3b038e0e88f6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn2g7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.728380 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://369a34a7a00c60a896df146fb0c36cc63bf9ddd8bd2e4d4afbb0fcd122732b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.752029 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fsthv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6069cadd-c466-42b0-a195-f2b2537f17b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://316e1157555697327771c7094901b46130ce6cfd7b62a22a30073d37c9c3de30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a38c0437dcf90d1ec69998198c4733724010f3308609bc7003c28ccdf9d001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73288bcc8eab9407d8fcf74259a892371d9f94a94037e6ba98adbe87569f32be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88e8387204216c6a2d9e5e5fbc05a912efcada0669f128f90935b7ad6fd50afb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://618d8fa8d48afe3ee24aa7b7d46f25bd04388c345d15dca4afb6c0a3c99e9764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b21db0734f5379f925c16d0bda5784530d1fd9824ba03ccd2dc09737c99d638\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99c821258e6d1cf0e004c196d0d8906ddff4dfc1a95a79fcc9d207c819e254e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k997j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fsthv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.770127 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmmr7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f297cf4-7106-4aee-af55-e0a404e56b39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d91c5918c686b894842455b7c7163814a8a83cbf1ae941f0f32e55fd98b1f219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlzkl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmmr7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.789487 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ce1c610f7eb7f0f2b4ba832e1abfcf76ada2cd4ba09244785a7ae292b7ebe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d
4e335d00fc03035e3ec59054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns6wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5rfqz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.797892 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.797947 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.797970 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:59 crc 
kubenswrapper[4669]: I1001 11:29:59.798002 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.798029 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:59Z","lastTransitionTime":"2025-10-01T11:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.812837 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9kgdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"238b8e33-ca8b-419a-b038-329ab97a3843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:39Z\\\",\\\"message\\\":\\\"2025-10-01T11:28:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d\\\\n2025-10-01T11:28:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8340d86d-14be-4431-8e0a-3bb70a14221d to /host/opt/cni/bin/\\\\n2025-10-01T11:28:53Z [verbose] multus-daemon started\\\\n2025-10-01T11:28:53Z [verbose] Readiness Indicator file check\\\\n2025-10-01T11:29:38Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhzk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9kgdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.837760 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c5784d2-a874-4956-9d09-e923ac324925\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc09ceeab9b4ad70d85d56c420dcf8e28874bad5ed78b374fc1924db422b8c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:19Z\\\",\\\"message\\\":\\\"+0000 UTC m=+1.639324419): skip\\\\nI1001 11:29:19.554129 6355 services_controller.go:445] Built service openshift-cluster-version/cluster-version-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 11:29:19.554279 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h 
after 0 failed attempt(s)\\\\nI1001 11:29:19.554303 6355 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1001 11:29:19.553823 6355 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-z8kl5 after 0 failed attempt(s)\\\\nF1001 11:29:19.554306 6355 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T11:29:48Z\\\",\\\"message\\\":\\\"al\\\\nI1001 11:29:48.091051 6709 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 11:29:48.091059 6709 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 11:29:48.091052 6709 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1001 11:29:48.091117 6709 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 11:29:48.091209 6709 
factory.go:656] Stopping watch factory\\\\nI1001 11:29:48.091473 6709 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:48.091903 6709 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 11:29:48.091926 6709 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 11:29:48.091936 6709 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 11:29:48.091949 6709 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 11:29:48.092002 6709 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 11:29:48.092027 6709 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:29:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45sfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8kl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.853835 4669 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d5916e4-204b-444a-ba00-8d65877d42f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e7d6d37d2b77e54ae36552a854cf14a01cc034d5b4412a284d82d89309f47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://defb73d1700cb2dcc31cadc6f8f57fc9aa3359a8eec96659695ada9c48175b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://671856fb498717b855a721a871f0e5c8f511eaadfc0f9f6fa2ec7abaf9c745c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c963f2f711248926193d97c782d1f75aef0604b1e11f7c9443882a6f5827ca9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.886795 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96aca807-1e36-414e-8ec8-52cadfd417a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fb0dc6993f0fec463ffe6fec7fe1b31c5d6c109c8adfc0bc22f3f2123c1242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b658b5491f89bf4aac23f2faeb61d60f66b5c5f0c75e184e7a138d93c8833f2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f606ea18d80c0366caff03caeaf54e2e9a2688274434c390cf71698c1b478e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b06311e061319697f29e496428ca3d08b9b27f8a9d5df3e1a98e54ee1b2e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a0397e87c55f3205e7fc6f34f02be0bae9f709b1e7fe379509eb2d1a16dd1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9f0fc2b7e2a86e36c72c529a106046bdf3f8d48bf511b84698ae8cbab0f4f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8516bcd17d5041c9b33c80ac0db7511a9ceb972b9f72a167c5a77cf0ff314cc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d8d4b8a8fcee623b36de9bd914b4d9ce43e502d3380df9f61525ad16ce22cc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.901201 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.901284 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.901308 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.901344 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.901367 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:29:59Z","lastTransitionTime":"2025-10-01T11:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.909942 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.929356 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.943853 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc 
kubenswrapper[4669]: I1001 11:29:59.961162 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6b93a6-fa53-42b0-8563-6ea4123a0cd7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb4b78de0070f57401ab73d5f6e47d62a5752c1db89a5a25d2ba05592c6878a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15bbac1ae30396ae6308715bed291356384e32beecf27a0464939b01a6c6d44c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db32ff64b8dd7190a15a38b7c3de701a2381b513d437e4577e3ddb54fa614c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.982115 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af
67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:29:59 crc kubenswrapper[4669]: I1001 11:29:59.999186 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T11:29:59Z is after 2025-08-24T17:21:41Z" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.004756 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.004809 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.004829 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.004865 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.004889 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:00Z","lastTransitionTime":"2025-10-01T11:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.019257 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:30:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.107770 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.107867 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.107885 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.107912 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.107933 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:00Z","lastTransitionTime":"2025-10-01T11:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.211250 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.211324 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.211344 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.211370 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.211390 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:00Z","lastTransitionTime":"2025-10-01T11:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.314265 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.314330 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.314349 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.314376 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.314395 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:00Z","lastTransitionTime":"2025-10-01T11:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.417458 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.417539 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.417587 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.417619 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.417643 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:00Z","lastTransitionTime":"2025-10-01T11:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.520501 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.520550 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.520565 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.520588 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.520606 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:00Z","lastTransitionTime":"2025-10-01T11:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.624448 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.624508 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.624530 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.624559 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.624583 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:00Z","lastTransitionTime":"2025-10-01T11:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.643139 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.643296 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:00 crc kubenswrapper[4669]: E1001 11:30:00.643417 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.643500 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:00 crc kubenswrapper[4669]: E1001 11:30:00.643699 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:00 crc kubenswrapper[4669]: E1001 11:30:00.644208 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.645267 4669 scope.go:117] "RemoveContainer" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0" Oct 01 11:30:00 crc kubenswrapper[4669]: E1001 11:30:00.645477 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.666718 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:30:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.685756 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30ba513f-67c5-4e4f-b8a7-be9c67660bec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgm6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:29:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wvnw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:30:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:30:00 crc 
kubenswrapper[4669]: I1001 11:30:00.703221 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6b93a6-fa53-42b0-8563-6ea4123a0cd7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb4b78de0070f57401ab73d5f6e47d62a5752c1db89a5a25d2ba05592c6878a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15bbac1ae30396ae6308715bed291356384e32beecf27a0464939b01a6c6d44c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db32ff64b8dd7190a15a38b7c3de701a2381b513d437e4577e3ddb54fa614c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d63489b6e2eae6705a43efda50da635b1a427f1bde68745bdc354491ff7981f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:30:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.718306 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa21b94e-6c66-4942-8528-a1ef2fa7c12f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b213f41259bd82ff2c26ee698af09bb726c0927b06df4ff9b303ed593ff7aa35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5199989eaade1e93c85fdc07f7091ce48f807f93e7dc727a977d5587bea526bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58e68f7c6d406e4fa7a7f0d6981b06709a75ace2f58f0e333dccb15f6435f880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c37e4f59a1d366ea194f9b9a4e772c129cecbee9a7e732f37888760117436988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79daa6187cd5b0a8628411215dc0112ed4c33134ace63b0c28fdede46be5acc4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T11:28:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 11:28:49.135564 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 11:28:49.135730 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 11:28:49.136506 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029746450/tls.crt::/tmp/serving-cert-4029746450/tls.key\\\\\\\"\\\\nI1001 11:28:49.537762 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 11:28:49.554345 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 11:28:49.554382 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 11:28:49.554418 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 11:28:49.554426 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 11:28:49.566668 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 11:28:49.566715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566723 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 11:28:49.566730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 11:28:49.566734 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 11:28:49.566737 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 11:28:49.566742 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 11:28:49.566739 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 11:28:49.570868 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4a0e3bbfbbac7315ef989b890ab0d18e62caf12c7d07e1163a97d6e12c90c5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:32Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af67008d2bf3aa87636872671f2518a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592717e4538680ece4a0863b2592375af
67008d2bf3aa87636872671f2518a27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T11:28:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T11:28:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T11:28:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:30:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.727958 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.728019 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.728039 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.728061 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.728096 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:00Z","lastTransitionTime":"2025-10-01T11:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.731794 4669 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T11:28:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b221caa1da8810ee71e5266e27498a14cff6fc48c02f888022e1bddab4e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T11:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T11:30:00Z is after 2025-08-24T17:21:41Z" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.826996 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknp5" podStartSLOduration=70.826976781 podStartE2EDuration="1m10.826976781s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:00.826404307 +0000 UTC m=+91.925969284" watchObservedRunningTime="2025-10-01 11:30:00.826976781 +0000 UTC m=+91.926541758" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.827125 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bf8lj" podStartSLOduration=70.827120534 podStartE2EDuration="1m10.827120534s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:00.811123102 +0000 UTC m=+91.910688079" watchObservedRunningTime="2025-10-01 11:30:00.827120534 +0000 UTC m=+91.926685511" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.832822 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.832855 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.832891 4669 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.832904 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.832913 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:00Z","lastTransitionTime":"2025-10-01T11:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.865218 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zmmr7" podStartSLOduration=70.865201029 podStartE2EDuration="1m10.865201029s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:00.864927012 +0000 UTC m=+91.964491989" watchObservedRunningTime="2025-10-01 11:30:00.865201029 +0000 UTC m=+91.964766006" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.865384 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fsthv" podStartSLOduration=70.865380063 podStartE2EDuration="1m10.865380063s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:00.85383601 +0000 UTC m=+91.953400987" watchObservedRunningTime="2025-10-01 11:30:00.865380063 +0000 UTC m=+91.964945030" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.889621 4669 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podStartSLOduration=70.889596979 podStartE2EDuration="1m10.889596979s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:00.87702589 +0000 UTC m=+91.976590867" watchObservedRunningTime="2025-10-01 11:30:00.889596979 +0000 UTC m=+91.989161956" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.918162 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9kgdm" podStartSLOduration=70.918137159 podStartE2EDuration="1m10.918137159s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:00.890378737 +0000 UTC m=+91.989943704" watchObservedRunningTime="2025-10-01 11:30:00.918137159 +0000 UTC m=+92.017702136" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.935355 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.935398 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.935409 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.935426 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.935449 4669 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:00Z","lastTransitionTime":"2025-10-01T11:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.959128 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.959063734 podStartE2EDuration="1m11.959063734s" podCreationTimestamp="2025-10-01 11:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:00.936942341 +0000 UTC m=+92.036507318" watchObservedRunningTime="2025-10-01 11:30:00.959063734 +0000 UTC m=+92.058628711" Oct 01 11:30:00 crc kubenswrapper[4669]: I1001 11:30:00.959507 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=69.959501785 podStartE2EDuration="1m9.959501785s" podCreationTimestamp="2025-10-01 11:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:00.958297486 +0000 UTC m=+92.057862473" watchObservedRunningTime="2025-10-01 11:30:00.959501785 +0000 UTC m=+92.059066762" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.039138 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.039201 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.039223 4669 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.039250 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.039273 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:01Z","lastTransitionTime":"2025-10-01T11:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.142596 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.142651 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.142660 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.142695 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.142707 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:01Z","lastTransitionTime":"2025-10-01T11:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.246369 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.246405 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.246415 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.246432 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.246442 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:01Z","lastTransitionTime":"2025-10-01T11:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.348737 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.348802 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.348819 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.348843 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.348862 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:01Z","lastTransitionTime":"2025-10-01T11:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.452529 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.452581 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.452590 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.452605 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.452617 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:01Z","lastTransitionTime":"2025-10-01T11:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.555454 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.555521 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.555539 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.555567 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.555587 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:01Z","lastTransitionTime":"2025-10-01T11:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.643875 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:01 crc kubenswrapper[4669]: E1001 11:30:01.644167 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.658882 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.658947 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.658968 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.658996 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.659020 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:01Z","lastTransitionTime":"2025-10-01T11:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.762876 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.762933 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.762959 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.762989 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.763009 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:01Z","lastTransitionTime":"2025-10-01T11:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.866877 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.866966 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.866993 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.867028 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.867052 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:01Z","lastTransitionTime":"2025-10-01T11:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.970663 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.970725 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.970739 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.970763 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:01 crc kubenswrapper[4669]: I1001 11:30:01.970779 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:01Z","lastTransitionTime":"2025-10-01T11:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.074640 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.074707 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.074731 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.074756 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.074777 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:02Z","lastTransitionTime":"2025-10-01T11:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.178005 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.178127 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.178148 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.178182 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.178256 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:02Z","lastTransitionTime":"2025-10-01T11:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.282455 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.282537 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.282561 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.282593 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.282620 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:02Z","lastTransitionTime":"2025-10-01T11:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.385685 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.385837 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.385857 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.385887 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.385909 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:02Z","lastTransitionTime":"2025-10-01T11:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.488609 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.488642 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.488651 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.488665 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.488674 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:02Z","lastTransitionTime":"2025-10-01T11:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.590830 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.590872 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.590882 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.590900 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.590911 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:02Z","lastTransitionTime":"2025-10-01T11:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.644186 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.644257 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.644264 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:02 crc kubenswrapper[4669]: E1001 11:30:02.644492 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:02 crc kubenswrapper[4669]: E1001 11:30:02.644615 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:02 crc kubenswrapper[4669]: E1001 11:30:02.644752 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.694278 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.694365 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.694383 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.694409 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.694428 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:02Z","lastTransitionTime":"2025-10-01T11:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.797482 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.797754 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.797779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.797807 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.797832 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:02Z","lastTransitionTime":"2025-10-01T11:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.901660 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.901734 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.901753 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.901779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:02 crc kubenswrapper[4669]: I1001 11:30:02.901798 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:02Z","lastTransitionTime":"2025-10-01T11:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.005586 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.005659 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.005682 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.005709 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.005727 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:03Z","lastTransitionTime":"2025-10-01T11:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.108989 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.109111 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.109139 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.109172 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.109199 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:03Z","lastTransitionTime":"2025-10-01T11:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.212642 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.212712 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.212730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.212753 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.212767 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:03Z","lastTransitionTime":"2025-10-01T11:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.316005 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.316114 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.316135 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.316163 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.316184 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:03Z","lastTransitionTime":"2025-10-01T11:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.421226 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.421804 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.421827 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.421858 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.421877 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:03Z","lastTransitionTime":"2025-10-01T11:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.525211 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.525272 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.525282 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.525306 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.525321 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:03Z","lastTransitionTime":"2025-10-01T11:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.628442 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.628519 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.628533 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.628552 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.628574 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:03Z","lastTransitionTime":"2025-10-01T11:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.644048 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:03 crc kubenswrapper[4669]: E1001 11:30:03.644217 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.662398 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.732231 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.732277 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.732290 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.732313 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.732328 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:03Z","lastTransitionTime":"2025-10-01T11:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.835541 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.835602 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.835619 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.835641 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.835655 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:03Z","lastTransitionTime":"2025-10-01T11:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.939416 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.939508 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.939534 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.939570 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:03 crc kubenswrapper[4669]: I1001 11:30:03.939594 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:03Z","lastTransitionTime":"2025-10-01T11:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.042792 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.042857 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.042877 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.042903 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.042922 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:04Z","lastTransitionTime":"2025-10-01T11:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.146479 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.146593 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.146614 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.146640 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.146661 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:04Z","lastTransitionTime":"2025-10-01T11:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.249582 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.249643 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.249659 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.249687 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.249707 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:04Z","lastTransitionTime":"2025-10-01T11:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.356894 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.356934 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.356947 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.356974 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.356991 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:04Z","lastTransitionTime":"2025-10-01T11:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.460287 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.460362 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.460380 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.460408 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.460428 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:04Z","lastTransitionTime":"2025-10-01T11:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.563505 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.563546 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.563557 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.563573 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.563584 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:04Z","lastTransitionTime":"2025-10-01T11:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.643865 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.643943 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.643977 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:04 crc kubenswrapper[4669]: E1001 11:30:04.644186 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:04 crc kubenswrapper[4669]: E1001 11:30:04.644359 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:04 crc kubenswrapper[4669]: E1001 11:30:04.644491 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.666531 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.666574 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.666603 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.666623 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.666641 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:04Z","lastTransitionTime":"2025-10-01T11:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.769597 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.769656 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.769673 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.769697 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.769714 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:04Z","lastTransitionTime":"2025-10-01T11:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.872723 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.872791 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.872813 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.872839 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.872861 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:04Z","lastTransitionTime":"2025-10-01T11:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.975226 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.975293 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.975304 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.975324 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:04 crc kubenswrapper[4669]: I1001 11:30:04.975335 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:04Z","lastTransitionTime":"2025-10-01T11:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.078219 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.078315 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.078335 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.078363 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.078383 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:05Z","lastTransitionTime":"2025-10-01T11:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.181679 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.181754 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.181771 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.181803 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.181820 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:05Z","lastTransitionTime":"2025-10-01T11:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.285637 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.285703 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.285721 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.285747 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.285769 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:05Z","lastTransitionTime":"2025-10-01T11:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.388652 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.388730 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.388752 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.388779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.388799 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:05Z","lastTransitionTime":"2025-10-01T11:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.492840 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.492926 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.492946 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.493013 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.493034 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:05Z","lastTransitionTime":"2025-10-01T11:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.596630 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.596690 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.596710 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.596744 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.596770 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:05Z","lastTransitionTime":"2025-10-01T11:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.643751 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:05 crc kubenswrapper[4669]: E1001 11:30:05.643969 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.699722 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.699926 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.699967 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.699996 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.700055 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:05Z","lastTransitionTime":"2025-10-01T11:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.803199 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.803273 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.803299 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.803331 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.803355 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:05Z","lastTransitionTime":"2025-10-01T11:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.907250 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.907311 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.907329 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.907354 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:05 crc kubenswrapper[4669]: I1001 11:30:05.907380 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:05Z","lastTransitionTime":"2025-10-01T11:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.011166 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.011238 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.011254 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.011279 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.011299 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:06Z","lastTransitionTime":"2025-10-01T11:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.114746 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.114826 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.114849 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.114882 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.114907 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:06Z","lastTransitionTime":"2025-10-01T11:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.218055 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.218129 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.218144 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.218160 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.218173 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:06Z","lastTransitionTime":"2025-10-01T11:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.322528 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.322573 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.322589 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.322609 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.322622 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:06Z","lastTransitionTime":"2025-10-01T11:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.428535 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.428709 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.428734 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.428764 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.428782 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:06Z","lastTransitionTime":"2025-10-01T11:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.532940 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.533359 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.533516 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.533678 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.533804 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:06Z","lastTransitionTime":"2025-10-01T11:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.637522 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.637596 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.637617 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.637649 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.637673 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:06Z","lastTransitionTime":"2025-10-01T11:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.643895 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.643918 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:06 crc kubenswrapper[4669]: E1001 11:30:06.644070 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.644127 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:06 crc kubenswrapper[4669]: E1001 11:30:06.644579 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:06 crc kubenswrapper[4669]: E1001 11:30:06.644675 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.741527 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.741594 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.741613 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.741638 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.741658 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:06Z","lastTransitionTime":"2025-10-01T11:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.847244 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.847318 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.847340 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.847367 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.847389 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:06Z","lastTransitionTime":"2025-10-01T11:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.950480 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.950779 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.950924 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.951128 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:06 crc kubenswrapper[4669]: I1001 11:30:06.951274 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:06Z","lastTransitionTime":"2025-10-01T11:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.055294 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.055368 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.055386 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.055414 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.055438 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:07Z","lastTransitionTime":"2025-10-01T11:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.159117 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.159567 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.159740 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.159870 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.160001 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:07Z","lastTransitionTime":"2025-10-01T11:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.263036 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.263092 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.263102 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.263117 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.263126 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:07Z","lastTransitionTime":"2025-10-01T11:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.367368 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.367415 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.367426 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.367442 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.367456 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:07Z","lastTransitionTime":"2025-10-01T11:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.470797 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.470859 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.470907 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.470961 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.470978 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:07Z","lastTransitionTime":"2025-10-01T11:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.574417 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.574525 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.574555 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.574592 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.574618 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:07Z","lastTransitionTime":"2025-10-01T11:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.585538 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.585673 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.585696 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.585723 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.585747 4669 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T11:30:07Z","lastTransitionTime":"2025-10-01T11:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.644208 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:07 crc kubenswrapper[4669]: E1001 11:30:07.644690 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.654680 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv"] Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.655306 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.658757 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.658862 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.659068 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.659194 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.681943 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.681923096 podStartE2EDuration="1m18.681923096s" podCreationTimestamp="2025-10-01 11:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:07.681773892 +0000 UTC m=+98.781338929" watchObservedRunningTime="2025-10-01 11:30:07.681923096 +0000 UTC m=+98.781488073" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.730855 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c8217e-f06e-4133-a450-34759038d9a8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.730948 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/66c8217e-f06e-4133-a450-34759038d9a8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.731021 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66c8217e-f06e-4133-a450-34759038d9a8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.731056 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/66c8217e-f06e-4133-a450-34759038d9a8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.731149 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66c8217e-f06e-4133-a450-34759038d9a8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: 
\"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.775269 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=45.775208596 podStartE2EDuration="45.775208596s" podCreationTimestamp="2025-10-01 11:29:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:07.764319689 +0000 UTC m=+98.863884666" watchObservedRunningTime="2025-10-01 11:30:07.775208596 +0000 UTC m=+98.874773573" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.790117 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.790097962 podStartE2EDuration="4.790097962s" podCreationTimestamp="2025-10-01 11:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:07.775857432 +0000 UTC m=+98.875422409" watchObservedRunningTime="2025-10-01 11:30:07.790097962 +0000 UTC m=+98.889662939" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.832920 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66c8217e-f06e-4133-a450-34759038d9a8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.833020 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c8217e-f06e-4133-a450-34759038d9a8-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.833060 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/66c8217e-f06e-4133-a450-34759038d9a8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.833174 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66c8217e-f06e-4133-a450-34759038d9a8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.833210 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/66c8217e-f06e-4133-a450-34759038d9a8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.833307 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/66c8217e-f06e-4133-a450-34759038d9a8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.833936 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/66c8217e-f06e-4133-a450-34759038d9a8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.835145 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66c8217e-f06e-4133-a450-34759038d9a8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.842055 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c8217e-f06e-4133-a450-34759038d9a8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.864124 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66c8217e-f06e-4133-a450-34759038d9a8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-htfrv\" (UID: \"66c8217e-f06e-4133-a450-34759038d9a8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:07 crc kubenswrapper[4669]: I1001 11:30:07.977635 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" Oct 01 11:30:08 crc kubenswrapper[4669]: W1001 11:30:08.003965 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66c8217e_f06e_4133_a450_34759038d9a8.slice/crio-338e662cd71a42eefa09758182b3fda1a51d9c8a73a0930896f49f8893355413 WatchSource:0}: Error finding container 338e662cd71a42eefa09758182b3fda1a51d9c8a73a0930896f49f8893355413: Status 404 returned error can't find the container with id 338e662cd71a42eefa09758182b3fda1a51d9c8a73a0930896f49f8893355413 Oct 01 11:30:08 crc kubenswrapper[4669]: I1001 11:30:08.372326 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" event={"ID":"66c8217e-f06e-4133-a450-34759038d9a8","Type":"ContainerStarted","Data":"338e662cd71a42eefa09758182b3fda1a51d9c8a73a0930896f49f8893355413"} Oct 01 11:30:08 crc kubenswrapper[4669]: I1001 11:30:08.643723 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:08 crc kubenswrapper[4669]: I1001 11:30:08.643751 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:08 crc kubenswrapper[4669]: E1001 11:30:08.645457 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:08 crc kubenswrapper[4669]: I1001 11:30:08.643908 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:08 crc kubenswrapper[4669]: E1001 11:30:08.645618 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:08 crc kubenswrapper[4669]: I1001 11:30:08.645654 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:08 crc kubenswrapper[4669]: E1001 11:30:08.645805 4669 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:30:08 crc kubenswrapper[4669]: E1001 11:30:08.645843 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:08 crc kubenswrapper[4669]: E1001 11:30:08.645876 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs podName:30ba513f-67c5-4e4f-b8a7-be9c67660bec nodeName:}" failed. 
No retries permitted until 2025-10-01 11:31:12.645845955 +0000 UTC m=+163.745410972 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs") pod "network-metrics-daemon-wvnw6" (UID: "30ba513f-67c5-4e4f-b8a7-be9c67660bec") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 11:30:09 crc kubenswrapper[4669]: I1001 11:30:09.386194 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" event={"ID":"66c8217e-f06e-4133-a450-34759038d9a8","Type":"ContainerStarted","Data":"60ea5fb9ab98029a1cae2e481f4d190289b5db5a6b94d6a39f1d67dbe0dd3b65"} Oct 01 11:30:09 crc kubenswrapper[4669]: I1001 11:30:09.644013 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:09 crc kubenswrapper[4669]: E1001 11:30:09.645885 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:10 crc kubenswrapper[4669]: I1001 11:30:10.643500 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:10 crc kubenswrapper[4669]: I1001 11:30:10.643732 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:10 crc kubenswrapper[4669]: I1001 11:30:10.643777 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:10 crc kubenswrapper[4669]: E1001 11:30:10.643905 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:10 crc kubenswrapper[4669]: E1001 11:30:10.644382 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:10 crc kubenswrapper[4669]: E1001 11:30:10.644555 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:11 crc kubenswrapper[4669]: I1001 11:30:11.643309 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:11 crc kubenswrapper[4669]: E1001 11:30:11.643475 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:11 crc kubenswrapper[4669]: I1001 11:30:11.644835 4669 scope.go:117] "RemoveContainer" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0" Oct 01 11:30:11 crc kubenswrapper[4669]: E1001 11:30:11.645129 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" Oct 01 11:30:12 crc kubenswrapper[4669]: I1001 11:30:12.643921 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:12 crc kubenswrapper[4669]: I1001 11:30:12.643973 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:12 crc kubenswrapper[4669]: I1001 11:30:12.643933 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:12 crc kubenswrapper[4669]: E1001 11:30:12.644195 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:12 crc kubenswrapper[4669]: E1001 11:30:12.644465 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:12 crc kubenswrapper[4669]: E1001 11:30:12.644643 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:13 crc kubenswrapper[4669]: I1001 11:30:13.643306 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:13 crc kubenswrapper[4669]: E1001 11:30:13.643532 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:14 crc kubenswrapper[4669]: I1001 11:30:14.643716 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:14 crc kubenswrapper[4669]: I1001 11:30:14.643778 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:14 crc kubenswrapper[4669]: E1001 11:30:14.644226 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:14 crc kubenswrapper[4669]: E1001 11:30:14.644376 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:14 crc kubenswrapper[4669]: I1001 11:30:14.644434 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:14 crc kubenswrapper[4669]: E1001 11:30:14.644672 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:15 crc kubenswrapper[4669]: I1001 11:30:15.643498 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:15 crc kubenswrapper[4669]: E1001 11:30:15.643693 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:16 crc kubenswrapper[4669]: I1001 11:30:16.643757 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:16 crc kubenswrapper[4669]: I1001 11:30:16.644032 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:16 crc kubenswrapper[4669]: E1001 11:30:16.644184 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:16 crc kubenswrapper[4669]: I1001 11:30:16.644041 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:16 crc kubenswrapper[4669]: E1001 11:30:16.644485 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:16 crc kubenswrapper[4669]: E1001 11:30:16.644393 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:17 crc kubenswrapper[4669]: I1001 11:30:17.644036 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:17 crc kubenswrapper[4669]: E1001 11:30:17.644336 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:18 crc kubenswrapper[4669]: I1001 11:30:18.643998 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:18 crc kubenswrapper[4669]: I1001 11:30:18.644156 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:18 crc kubenswrapper[4669]: I1001 11:30:18.643999 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:18 crc kubenswrapper[4669]: E1001 11:30:18.644268 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:18 crc kubenswrapper[4669]: E1001 11:30:18.644375 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:18 crc kubenswrapper[4669]: E1001 11:30:18.644493 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:19 crc kubenswrapper[4669]: I1001 11:30:19.643792 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:19 crc kubenswrapper[4669]: E1001 11:30:19.646209 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:20 crc kubenswrapper[4669]: I1001 11:30:20.644073 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:20 crc kubenswrapper[4669]: I1001 11:30:20.644108 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:20 crc kubenswrapper[4669]: I1001 11:30:20.644268 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:20 crc kubenswrapper[4669]: E1001 11:30:20.644703 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:20 crc kubenswrapper[4669]: E1001 11:30:20.644871 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:20 crc kubenswrapper[4669]: E1001 11:30:20.645214 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:21 crc kubenswrapper[4669]: I1001 11:30:21.643869 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:21 crc kubenswrapper[4669]: E1001 11:30:21.644044 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:22 crc kubenswrapper[4669]: I1001 11:30:22.643710 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:22 crc kubenswrapper[4669]: E1001 11:30:22.643900 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:22 crc kubenswrapper[4669]: I1001 11:30:22.643724 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:22 crc kubenswrapper[4669]: E1001 11:30:22.644042 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:22 crc kubenswrapper[4669]: I1001 11:30:22.644457 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:22 crc kubenswrapper[4669]: E1001 11:30:22.644601 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:23 crc kubenswrapper[4669]: I1001 11:30:23.644011 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:23 crc kubenswrapper[4669]: E1001 11:30:23.644237 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:24 crc kubenswrapper[4669]: I1001 11:30:24.644211 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:24 crc kubenswrapper[4669]: I1001 11:30:24.644328 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:24 crc kubenswrapper[4669]: E1001 11:30:24.644687 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:24 crc kubenswrapper[4669]: E1001 11:30:24.644792 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:24 crc kubenswrapper[4669]: I1001 11:30:24.644886 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:24 crc kubenswrapper[4669]: E1001 11:30:24.645029 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:25 crc kubenswrapper[4669]: I1001 11:30:25.643873 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:25 crc kubenswrapper[4669]: E1001 11:30:25.644910 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:25 crc kubenswrapper[4669]: I1001 11:30:25.645485 4669 scope.go:117] "RemoveContainer" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0" Oct 01 11:30:25 crc kubenswrapper[4669]: E1001 11:30:25.646440 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8kl5_openshift-ovn-kubernetes(6c5784d2-a874-4956-9d09-e923ac324925)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" Oct 01 11:30:26 crc kubenswrapper[4669]: I1001 11:30:26.453278 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9kgdm_238b8e33-ca8b-419a-b038-329ab97a3843/kube-multus/1.log" Oct 01 11:30:26 crc kubenswrapper[4669]: I1001 11:30:26.454012 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9kgdm_238b8e33-ca8b-419a-b038-329ab97a3843/kube-multus/0.log" Oct 01 11:30:26 crc kubenswrapper[4669]: I1001 11:30:26.454071 4669 generic.go:334] "Generic (PLEG): container finished" podID="238b8e33-ca8b-419a-b038-329ab97a3843" containerID="7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795" exitCode=1 Oct 01 11:30:26 crc kubenswrapper[4669]: I1001 11:30:26.454160 4669 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-9kgdm" event={"ID":"238b8e33-ca8b-419a-b038-329ab97a3843","Type":"ContainerDied","Data":"7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795"} Oct 01 11:30:26 crc kubenswrapper[4669]: I1001 11:30:26.454221 4669 scope.go:117] "RemoveContainer" containerID="7f2500e783fb73ebf0afdb6c62a5d3be3fb2d0f1c3a2fa8088013fb8d5a76471" Oct 01 11:30:26 crc kubenswrapper[4669]: I1001 11:30:26.454616 4669 scope.go:117] "RemoveContainer" containerID="7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795" Oct 01 11:30:26 crc kubenswrapper[4669]: E1001 11:30:26.454778 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-9kgdm_openshift-multus(238b8e33-ca8b-419a-b038-329ab97a3843)\"" pod="openshift-multus/multus-9kgdm" podUID="238b8e33-ca8b-419a-b038-329ab97a3843" Oct 01 11:30:26 crc kubenswrapper[4669]: I1001 11:30:26.482779 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-htfrv" podStartSLOduration=96.482689694 podStartE2EDuration="1m36.482689694s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:09.410110332 +0000 UTC m=+100.509675339" watchObservedRunningTime="2025-10-01 11:30:26.482689694 +0000 UTC m=+117.582254691" Oct 01 11:30:26 crc kubenswrapper[4669]: I1001 11:30:26.643394 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:26 crc kubenswrapper[4669]: I1001 11:30:26.643528 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:26 crc kubenswrapper[4669]: I1001 11:30:26.643581 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:26 crc kubenswrapper[4669]: E1001 11:30:26.643781 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:26 crc kubenswrapper[4669]: E1001 11:30:26.643946 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:26 crc kubenswrapper[4669]: E1001 11:30:26.644179 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:27 crc kubenswrapper[4669]: I1001 11:30:27.460658 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9kgdm_238b8e33-ca8b-419a-b038-329ab97a3843/kube-multus/1.log" Oct 01 11:30:27 crc kubenswrapper[4669]: I1001 11:30:27.643685 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:27 crc kubenswrapper[4669]: E1001 11:30:27.643876 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:28 crc kubenswrapper[4669]: I1001 11:30:28.643376 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:28 crc kubenswrapper[4669]: I1001 11:30:28.643476 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:28 crc kubenswrapper[4669]: I1001 11:30:28.643379 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:28 crc kubenswrapper[4669]: E1001 11:30:28.643558 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:28 crc kubenswrapper[4669]: E1001 11:30:28.643756 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:28 crc kubenswrapper[4669]: E1001 11:30:28.643876 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:29 crc kubenswrapper[4669]: E1001 11:30:29.643415 4669 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 01 11:30:29 crc kubenswrapper[4669]: I1001 11:30:29.643530 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:29 crc kubenswrapper[4669]: E1001 11:30:29.646153 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:29 crc kubenswrapper[4669]: E1001 11:30:29.757336 4669 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 11:30:30 crc kubenswrapper[4669]: I1001 11:30:30.644027 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:30 crc kubenswrapper[4669]: I1001 11:30:30.644174 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:30 crc kubenswrapper[4669]: I1001 11:30:30.644027 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:30 crc kubenswrapper[4669]: E1001 11:30:30.644281 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:30 crc kubenswrapper[4669]: E1001 11:30:30.644395 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:30 crc kubenswrapper[4669]: E1001 11:30:30.644553 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:31 crc kubenswrapper[4669]: I1001 11:30:31.643233 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:31 crc kubenswrapper[4669]: E1001 11:30:31.643438 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:32 crc kubenswrapper[4669]: I1001 11:30:32.643623 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:32 crc kubenswrapper[4669]: I1001 11:30:32.643765 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:32 crc kubenswrapper[4669]: I1001 11:30:32.643643 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:32 crc kubenswrapper[4669]: E1001 11:30:32.643852 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:32 crc kubenswrapper[4669]: E1001 11:30:32.643967 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:32 crc kubenswrapper[4669]: E1001 11:30:32.644120 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:33 crc kubenswrapper[4669]: I1001 11:30:33.643668 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:33 crc kubenswrapper[4669]: E1001 11:30:33.643875 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:34 crc kubenswrapper[4669]: I1001 11:30:34.643165 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:34 crc kubenswrapper[4669]: E1001 11:30:34.643438 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:34 crc kubenswrapper[4669]: I1001 11:30:34.643195 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:34 crc kubenswrapper[4669]: E1001 11:30:34.643636 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:34 crc kubenswrapper[4669]: I1001 11:30:34.643171 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:34 crc kubenswrapper[4669]: E1001 11:30:34.643802 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:34 crc kubenswrapper[4669]: E1001 11:30:34.759416 4669 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 11:30:35 crc kubenswrapper[4669]: I1001 11:30:35.644194 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:35 crc kubenswrapper[4669]: E1001 11:30:35.644446 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:36 crc kubenswrapper[4669]: I1001 11:30:36.643626 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:36 crc kubenswrapper[4669]: E1001 11:30:36.643882 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:36 crc kubenswrapper[4669]: I1001 11:30:36.644705 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:36 crc kubenswrapper[4669]: E1001 11:30:36.644967 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:36 crc kubenswrapper[4669]: I1001 11:30:36.644159 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:36 crc kubenswrapper[4669]: E1001 11:30:36.645308 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:37 crc kubenswrapper[4669]: I1001 11:30:37.643704 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:37 crc kubenswrapper[4669]: E1001 11:30:37.644291 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:38 crc kubenswrapper[4669]: I1001 11:30:38.643716 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:38 crc kubenswrapper[4669]: I1001 11:30:38.643779 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:38 crc kubenswrapper[4669]: I1001 11:30:38.643720 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:38 crc kubenswrapper[4669]: E1001 11:30:38.643930 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:38 crc kubenswrapper[4669]: E1001 11:30:38.644044 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:38 crc kubenswrapper[4669]: E1001 11:30:38.644190 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:39 crc kubenswrapper[4669]: I1001 11:30:39.644270 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:39 crc kubenswrapper[4669]: E1001 11:30:39.645955 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:39 crc kubenswrapper[4669]: E1001 11:30:39.760742 4669 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 11:30:40 crc kubenswrapper[4669]: I1001 11:30:40.643538 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:40 crc kubenswrapper[4669]: I1001 11:30:40.643862 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:40 crc kubenswrapper[4669]: I1001 11:30:40.643998 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:40 crc kubenswrapper[4669]: I1001 11:30:40.644204 4669 scope.go:117] "RemoveContainer" containerID="7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795" Oct 01 11:30:40 crc kubenswrapper[4669]: E1001 11:30:40.644309 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:40 crc kubenswrapper[4669]: E1001 11:30:40.644477 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:40 crc kubenswrapper[4669]: E1001 11:30:40.644559 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:40 crc kubenswrapper[4669]: I1001 11:30:40.644649 4669 scope.go:117] "RemoveContainer" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0" Oct 01 11:30:41 crc kubenswrapper[4669]: I1001 11:30:41.516961 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/3.log" Oct 01 11:30:41 crc kubenswrapper[4669]: I1001 11:30:41.520067 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerStarted","Data":"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"} Oct 01 11:30:41 crc kubenswrapper[4669]: I1001 11:30:41.520533 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:30:41 crc 
kubenswrapper[4669]: I1001 11:30:41.522400 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9kgdm_238b8e33-ca8b-419a-b038-329ab97a3843/kube-multus/1.log" Oct 01 11:30:41 crc kubenswrapper[4669]: I1001 11:30:41.522459 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9kgdm" event={"ID":"238b8e33-ca8b-419a-b038-329ab97a3843","Type":"ContainerStarted","Data":"2b92a3e428a83d8c9dfc08f32f569a5b6ad6841717ca649606e0ea74c98b3996"} Oct 01 11:30:41 crc kubenswrapper[4669]: I1001 11:30:41.563509 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podStartSLOduration=111.563481861 podStartE2EDuration="1m51.563481861s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:41.558797225 +0000 UTC m=+132.658362202" watchObservedRunningTime="2025-10-01 11:30:41.563481861 +0000 UTC m=+132.663046878" Oct 01 11:30:41 crc kubenswrapper[4669]: I1001 11:30:41.643763 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:41 crc kubenswrapper[4669]: E1001 11:30:41.643972 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:41 crc kubenswrapper[4669]: I1001 11:30:41.786931 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wvnw6"] Oct 01 11:30:41 crc kubenswrapper[4669]: I1001 11:30:41.787182 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:41 crc kubenswrapper[4669]: E1001 11:30:41.787300 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:42 crc kubenswrapper[4669]: I1001 11:30:42.643432 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:42 crc kubenswrapper[4669]: I1001 11:30:42.643851 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:42 crc kubenswrapper[4669]: E1001 11:30:42.644063 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:42 crc kubenswrapper[4669]: E1001 11:30:42.644250 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:43 crc kubenswrapper[4669]: I1001 11:30:43.643829 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:43 crc kubenswrapper[4669]: I1001 11:30:43.643904 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:43 crc kubenswrapper[4669]: E1001 11:30:43.644036 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 11:30:43 crc kubenswrapper[4669]: E1001 11:30:43.644209 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvnw6" podUID="30ba513f-67c5-4e4f-b8a7-be9c67660bec" Oct 01 11:30:44 crc kubenswrapper[4669]: I1001 11:30:44.643951 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:44 crc kubenswrapper[4669]: I1001 11:30:44.643975 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:44 crc kubenswrapper[4669]: E1001 11:30:44.644268 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 11:30:44 crc kubenswrapper[4669]: E1001 11:30:44.644449 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 11:30:45 crc kubenswrapper[4669]: I1001 11:30:45.643977 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:30:45 crc kubenswrapper[4669]: I1001 11:30:45.644871 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:45 crc kubenswrapper[4669]: I1001 11:30:45.648205 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 01 11:30:45 crc kubenswrapper[4669]: I1001 11:30:45.648221 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 01 11:30:45 crc kubenswrapper[4669]: I1001 11:30:45.651308 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 01 11:30:45 crc kubenswrapper[4669]: I1001 11:30:45.651770 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 01 11:30:46 crc kubenswrapper[4669]: I1001 11:30:46.643824 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:46 crc kubenswrapper[4669]: I1001 11:30:46.643963 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:46 crc kubenswrapper[4669]: I1001 11:30:46.647464 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 01 11:30:46 crc kubenswrapper[4669]: I1001 11:30:46.647705 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.719866 4669 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.772797 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sbrcs"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.773299 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.775361 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-flpxd"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.775916 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.777711 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7q7n5"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.778349 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.780338 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.780769 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.781553 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.781602 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.781717 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.782144 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.781758 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.781802 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.784245 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-cclkd"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.785132 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.786060 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.786271 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.787234 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.797614 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.798124 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.799418 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wn6cw"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.800166 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.800992 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.801473 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.812153 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.812608 4669 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.812652 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.812797 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.812920 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.813212 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.813511 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.813846 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.814343 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.814876 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qqhrp"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.815478 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.825609 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.826038 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-hdrzq"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.826499 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.826062 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.827201 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.827216 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.827338 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hdrzq" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.828731 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6vbrp"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.828867 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.829257 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.829598 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.830057 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838619 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f196659-a904-4e87-a32c-cae07c3911ea-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838659 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdf5\" (UniqueName: \"kubernetes.io/projected/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-kube-api-access-hqdf5\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838681 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f196659-a904-4e87-a32c-cae07c3911ea-audit-policies\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838700 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7445657-b8e4-4974-a680-7a05f0628fb7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838733 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-config\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838750 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838781 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-etcd-serving-ca\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838796 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-encryption-config\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838816 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-etcd-client\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838830 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f196659-a904-4e87-a32c-cae07c3911ea-etcd-client\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838846 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-node-pullsecrets\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838865 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqs2k\" (UniqueName: \"kubernetes.io/projected/8f196659-a904-4e87-a32c-cae07c3911ea-kube-api-access-tqs2k\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838881 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glcpd\" (UniqueName: \"kubernetes.io/projected/e7445657-b8e4-4974-a680-7a05f0628fb7-kube-api-access-glcpd\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838901 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-image-import-ca\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838918 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-audit-dir\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838934 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75bcc3da-1b36-4ee1-860e-787d82ea77e2-serving-cert\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838962 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-config\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " 
pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838977 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-audit\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.838995 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f196659-a904-4e87-a32c-cae07c3911ea-audit-dir\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839028 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4l7b\" (UniqueName: \"kubernetes.io/projected/75bcc3da-1b36-4ee1-860e-787d82ea77e2-kube-api-access-t4l7b\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839047 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7445657-b8e4-4974-a680-7a05f0628fb7-config\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839064 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-client-ca\") 
pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839100 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f196659-a904-4e87-a32c-cae07c3911ea-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839117 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839136 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f196659-a904-4e87-a32c-cae07c3911ea-encryption-config\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839153 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e7445657-b8e4-4974-a680-7a05f0628fb7-images\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839171 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f196659-a904-4e87-a32c-cae07c3911ea-serving-cert\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839185 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-serving-cert\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839643 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839813 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.839934 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.840021 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.840396 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.840458 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.840540 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" 
Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.840686 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.840826 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.840985 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.840996 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.840701 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.841171 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.841483 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.841553 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.841602 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.841631 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.840701 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 01 
11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.843213 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8hc7m"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.843504 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.843946 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.845216 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.845319 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.845406 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.848688 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.850143 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x95jd"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.850689 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.851235 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s7dqt"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.851762 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.851857 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.852322 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.852510 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.854217 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.855550 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.855690 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.855767 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.855875 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.855986 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.855822 4669 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.856125 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.856407 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.856799 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.862329 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.863094 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.863580 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.864559 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.864777 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.867065 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.869378 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 01 11:30:48 crc 
kubenswrapper[4669]: I1001 11:30:48.869518 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.869385 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.869710 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.869851 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.869901 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.869972 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.870005 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.870127 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.870135 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.870501 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.874247 4669 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.875567 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.875658 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.876378 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.876540 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.879285 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.880186 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.889737 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7q7n5"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.890587 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.927947 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.928019 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.928197 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.928286 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.928350 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.928384 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.928518 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.928614 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.928355 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.928833 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 01 11:30:48 crc 
kubenswrapper[4669]: I1001 11:30:48.929422 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.929639 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.929963 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.930179 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.930998 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.933202 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.933922 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.934103 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.934200 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.934895 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gmqg9"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.935478 4669 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.936035 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-flpxd"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.937989 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.938330 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.938799 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.939182 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sbrcs"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940311 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-config\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940358 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940400 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-etcd-serving-ca\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940424 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-encryption-config\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940444 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-etcd-client\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940467 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f196659-a904-4e87-a32c-cae07c3911ea-etcd-client\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940488 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-node-pullsecrets\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940515 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tqs2k\" (UniqueName: \"kubernetes.io/projected/8f196659-a904-4e87-a32c-cae07c3911ea-kube-api-access-tqs2k\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940541 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glcpd\" (UniqueName: \"kubernetes.io/projected/e7445657-b8e4-4974-a680-7a05f0628fb7-kube-api-access-glcpd\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940566 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-image-import-ca\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940591 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75bcc3da-1b36-4ee1-860e-787d82ea77e2-serving-cert\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940609 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-audit-dir\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 
11:30:48.940630 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-audit\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940658 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-config\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940683 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f196659-a904-4e87-a32c-cae07c3911ea-audit-dir\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940704 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4l7b\" (UniqueName: \"kubernetes.io/projected/75bcc3da-1b36-4ee1-860e-787d82ea77e2-kube-api-access-t4l7b\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940725 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7445657-b8e4-4974-a680-7a05f0628fb7-config\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940745 
4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-client-ca\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940770 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f196659-a904-4e87-a32c-cae07c3911ea-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940792 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940813 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f196659-a904-4e87-a32c-cae07c3911ea-encryption-config\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940839 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e7445657-b8e4-4974-a680-7a05f0628fb7-images\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 
crc kubenswrapper[4669]: I1001 11:30:48.940863 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f196659-a904-4e87-a32c-cae07c3911ea-serving-cert\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940883 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-serving-cert\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940913 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f196659-a904-4e87-a32c-cae07c3911ea-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940939 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdf5\" (UniqueName: \"kubernetes.io/projected/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-kube-api-access-hqdf5\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.940965 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f196659-a904-4e87-a32c-cae07c3911ea-audit-policies\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 
crc kubenswrapper[4669]: I1001 11:30:48.940990 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7445657-b8e4-4974-a680-7a05f0628fb7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.941241 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-node-pullsecrets\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.941844 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-config\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.942313 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.942590 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-config\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.943393 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 01 11:30:48 crc 
kubenswrapper[4669]: I1001 11:30:48.943815 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.944861 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.945327 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.945572 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.945602 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-audit-dir\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.946647 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.948206 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-etcd-serving-ca\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.948802 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.948814 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-image-import-ca\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.948958 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.949209 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.949594 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.950036 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f196659-a904-4e87-a32c-cae07c3911ea-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.950612 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e7445657-b8e4-4974-a680-7a05f0628fb7-images\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.950760 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f196659-a904-4e87-a32c-cae07c3911ea-audit-policies\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc 
kubenswrapper[4669]: I1001 11:30:48.951531 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f196659-a904-4e87-a32c-cae07c3911ea-audit-dir\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.952392 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7445657-b8e4-4974-a680-7a05f0628fb7-config\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.952614 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f196659-a904-4e87-a32c-cae07c3911ea-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.953168 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-client-ca\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.953904 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc 
kubenswrapper[4669]: I1001 11:30:48.953928 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f196659-a904-4e87-a32c-cae07c3911ea-etcd-client\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.954173 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f196659-a904-4e87-a32c-cae07c3911ea-serving-cert\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.955126 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.957454 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-encryption-config\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.957583 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7445657-b8e4-4974-a680-7a05f0628fb7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.958230 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/75bcc3da-1b36-4ee1-860e-787d82ea77e2-serving-cert\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.959511 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.962720 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f196659-a904-4e87-a32c-cae07c3911ea-encryption-config\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.964896 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-audit\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.965052 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-etcd-client\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.965297 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc 
kubenswrapper[4669]: I1001 11:30:48.970784 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-serving-cert\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.973411 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.973732 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.978825 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.979347 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.980972 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.981039 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.981475 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.981864 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.982491 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.984008 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.984597 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.995776 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.996776 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6"] Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.997432 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" Oct 01 11:30:48 crc kubenswrapper[4669]: I1001 11:30:48.997881 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:48.998853 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:48.999234 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.001963 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.002447 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.002909 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.003199 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zxlg8"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.005799 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.006295 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.006470 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.007204 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dcmjv"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.007253 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.008007 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.009013 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-q7cnf"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.009764 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q7cnf" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.010871 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wn6cw"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.013710 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-44vxs"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.014470 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.015961 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.016151 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.017058 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.017991 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.018979 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.019833 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hdrzq"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.021266 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.022809 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8hc7m"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.024041 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-cclkd"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.025383 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6vbrp"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.026640 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.029803 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.031803 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x95jd"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.032976 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.034627 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 
11:30:49.035472 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.036462 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qqhrp"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.037909 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.038906 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q7cnf"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.040166 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.041289 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.042336 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.043584 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.045643 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.048190 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q8qw4"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.049520 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.049627 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.050539 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-x52td"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.051093 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x52td" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.051826 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.053315 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.054308 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.055848 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.055886 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.057183 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.058514 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.060155 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s7dqt"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.062031 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.063531 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dcmjv"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.065484 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.066493 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.067736 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q8qw4"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.068802 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-44vxs"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.070090 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zxlg8"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.071280 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.072917 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5bjch"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 
11:30:49.077979 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.082778 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5bjch"] Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.082904 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.095303 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.136540 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.164844 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.184513 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.198155 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.221565 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.255695 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.275361 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 01 11:30:49 crc kubenswrapper[4669]: 
I1001 11:30:49.296572 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.316095 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.335639 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.355184 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.382169 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.395792 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.416011 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.437406 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.457304 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.476314 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.479154 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.522386 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqs2k\" (UniqueName: \"kubernetes.io/projected/8f196659-a904-4e87-a32c-cae07c3911ea-kube-api-access-tqs2k\") pod \"apiserver-7bbb656c7d-n7wk4\" (UID: \"8f196659-a904-4e87-a32c-cae07c3911ea\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.542064 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glcpd\" (UniqueName: \"kubernetes.io/projected/e7445657-b8e4-4974-a680-7a05f0628fb7-kube-api-access-glcpd\") pod \"machine-api-operator-5694c8668f-7q7n5\" (UID: \"e7445657-b8e4-4974-a680-7a05f0628fb7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.566254 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdf5\" (UniqueName: \"kubernetes.io/projected/4c8e059d-1baa-4142-a6e9-af3c9bfe16d3-kube-api-access-hqdf5\") pod \"apiserver-76f77b778f-flpxd\" (UID: \"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3\") " pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.576570 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.579689 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4l7b\" (UniqueName: \"kubernetes.io/projected/75bcc3da-1b36-4ee1-860e-787d82ea77e2-kube-api-access-t4l7b\") pod \"controller-manager-879f6c89f-sbrcs\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.595869 4669 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.616482 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.636602 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.656267 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.677407 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.696657 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.717956 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.729599 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.736694 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.751617 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.757997 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.766871 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.776679 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.780910 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.796395 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.818531 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.835944 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.856828 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.875951 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.895616 4669 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.915905 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.937917 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.956677 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.976841 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 01 11:30:49 crc kubenswrapper[4669]: I1001 11:30:49.997820 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.014560 4669 request.go:700] Waited for 1.014994033s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackageserver-service-cert&limit=500&resourceVersion=0 Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.017266 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.051900 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.055903 4669 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.058464 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4"] Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.076057 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.096795 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.115958 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.136244 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sbrcs"] Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.137175 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.155936 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.176972 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.196852 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.216442 4669 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.237148 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.256445 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.267202 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7q7n5"] Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.270069 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-flpxd"] Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.276676 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: W1001 11:30:50.278051 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7445657_b8e4_4974_a680_7a05f0628fb7.slice/crio-4782b95df4568a6f01672c2f0445c8954e645a166e3dea38a59dfd906b6db05b WatchSource:0}: Error finding container 4782b95df4568a6f01672c2f0445c8954e645a166e3dea38a59dfd906b6db05b: Status 404 returned error can't find the container with id 4782b95df4568a6f01672c2f0445c8954e645a166e3dea38a59dfd906b6db05b Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.295792 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.316328 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.336643 4669 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.356725 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.376312 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.402097 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.416528 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.437263 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.455953 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.478930 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.497214 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.516577 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.535962 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.557029 4669 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.561152 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" event={"ID":"e7445657-b8e4-4974-a680-7a05f0628fb7","Type":"ContainerStarted","Data":"4782b95df4568a6f01672c2f0445c8954e645a166e3dea38a59dfd906b6db05b"} Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.562188 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" event={"ID":"75bcc3da-1b36-4ee1-860e-787d82ea77e2","Type":"ContainerStarted","Data":"b231f5d22917c84dd540a23b360c258c5cbb3fa95191ab9b3fe56cb743c659eb"} Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.563557 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" event={"ID":"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3","Type":"ContainerStarted","Data":"59d6f98d2b35cdbd75846408fac4357ef706529e4a0b97465527d87e3a2fa551"} Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.564541 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" event={"ID":"8f196659-a904-4e87-a32c-cae07c3911ea","Type":"ContainerStarted","Data":"4ab05264a75f714b44391c1916d2378d4257cc357cf80da08340b2fbfc10310d"} Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.576092 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.596101 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.617385 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: 
I1001 11:30:50.635915 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.655753 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.676011 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.696183 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.715489 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.736495 4669 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.756676 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.776419 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.796203 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.817421 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 01 11:30:50 crc 
kubenswrapper[4669]: I1001 11:30:50.836852 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.857266 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.876172 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.897020 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.974711 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f9513dc-5114-4ec2-81ec-d86a31c3635b-etcd-service-ca\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.975165 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-tls\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.975471 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1bf82d78-0b71-43b4-b6d3-babe39dd328e-auth-proxy-config\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 
11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.975576 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0debf10a-a4dd-43d7-84fd-3456a2ad1b59-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p62sv\" (UID: \"0debf10a-a4dd-43d7-84fd-3456a2ad1b59\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.975688 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4f9513dc-5114-4ec2-81ec-d86a31c3635b-etcd-ca\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.975789 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.975903 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88mgg\" (UniqueName: \"kubernetes.io/projected/0debf10a-a4dd-43d7-84fd-3456a2ad1b59-kube-api-access-88mgg\") pod \"cluster-samples-operator-665b6dd947-p62sv\" (UID: \"0debf10a-a4dd-43d7-84fd-3456a2ad1b59\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.976052 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-oauth-serving-cert\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.976247 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpmjs\" (UniqueName: \"kubernetes.io/projected/4f9513dc-5114-4ec2-81ec-d86a31c3635b-kube-api-access-bpmjs\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.976428 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" (UID: \"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.976549 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1bf82d78-0b71-43b4-b6d3-babe39dd328e-machine-approver-tls\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.976761 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.976879 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.976990 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-825pq\" (UniqueName: \"kubernetes.io/projected/d3392045-54bc-4b2a-a1f8-b7ac9f0d145b-kube-api-access-825pq\") pod \"openshift-controller-manager-operator-756b6f6bc6-zp57s\" (UID: \"d3392045-54bc-4b2a-a1f8-b7ac9f0d145b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.977149 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8e2e100b-8917-4730-83b1-2fc7716f740b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2qkmt\" (UID: \"8e2e100b-8917-4730-83b1-2fc7716f740b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.977279 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d223984b-062e-423a-bfb9-f28dc4dd215b-serving-cert\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.977384 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00e38d1c-41cd-437a-a6e9-3a53fe903c11-trusted-ca\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: \"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.977519 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-certificates\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.977661 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq8sz\" (UniqueName: \"kubernetes.io/projected/54979db4-1c85-4bfd-aec1-c154590ec33b-kube-api-access-bq8sz\") pod \"downloads-7954f5f757-hdrzq\" (UID: \"54979db4-1c85-4bfd-aec1-c154590ec33b\") " pod="openshift-console/downloads-7954f5f757-hdrzq" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.977764 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-serving-cert\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.977890 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v58c2\" (UniqueName: \"kubernetes.io/projected/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-kube-api-access-v58c2\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" (UID: 
\"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.978006 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2e100b-8917-4730-83b1-2fc7716f740b-serving-cert\") pod \"openshift-config-operator-7777fb866f-2qkmt\" (UID: \"8e2e100b-8917-4730-83b1-2fc7716f740b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.978147 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/796ccf8d-b179-440a-87a8-c6de61d08d4a-serving-cert\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.978286 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f9513dc-5114-4ec2-81ec-d86a31c3635b-etcd-client\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.978463 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5100377-ee4b-4427-9106-eea735423f5a-serving-cert\") pod \"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.978649 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.978875 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.979524 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d223984b-062e-423a-bfb9-f28dc4dd215b-trusted-ca\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.979698 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-oauth-config\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.979914 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcz8l\" (UniqueName: \"kubernetes.io/projected/796ccf8d-b179-440a-87a8-c6de61d08d4a-kube-api-access-rcz8l\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: 
\"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.980518 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.981003 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00e38d1c-41cd-437a-a6e9-3a53fe903c11-bound-sa-token\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: \"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.981140 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9513dc-5114-4ec2-81ec-d86a31c3635b-config\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.981236 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00e38d1c-41cd-437a-a6e9-3a53fe903c11-metrics-tls\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: \"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.981373 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rrxj8\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-kube-api-access-rrxj8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.981477 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d223984b-062e-423a-bfb9-f28dc4dd215b-config\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.981565 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" (UID: \"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.981663 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.981770 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" 
(UID: \"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.981876 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.981991 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.982132 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8958\" (UniqueName: \"kubernetes.io/projected/d223984b-062e-423a-bfb9-f28dc4dd215b-kube-api-access-x8958\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.982217 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-trusted-ca-bundle\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.982345 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3392045-54bc-4b2a-a1f8-b7ac9f0d145b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zp57s\" (UID: \"d3392045-54bc-4b2a-a1f8-b7ac9f0d145b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.982476 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-policies\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.982525 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796ccf8d-b179-440a-87a8-c6de61d08d4a-config\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.982563 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl65k\" (UniqueName: \"kubernetes.io/projected/1bf82d78-0b71-43b4-b6d3-babe39dd328e-kube-api-access-xl65k\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.982870 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbgg6\" (UniqueName: 
\"kubernetes.io/projected/00e38d1c-41cd-437a-a6e9-3a53fe903c11-kube-api-access-nbgg6\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: \"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.982941 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-dir\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.983003 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-service-ca\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.983038 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796ccf8d-b179-440a-87a8-c6de61d08d4a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.983540 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 
11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.983939 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9513dc-5114-4ec2-81ec-d86a31c3635b-serving-cert\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.984186 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3392045-54bc-4b2a-a1f8-b7ac9f0d145b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zp57s\" (UID: \"d3392045-54bc-4b2a-a1f8-b7ac9f0d145b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.984353 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-trusted-ca\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.984421 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-console-config\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.990275 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-client-ca\") pod 
\"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.990445 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf82d78-0b71-43b4-b6d3-babe39dd328e-config\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.990507 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-bound-sa-token\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.990560 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.990612 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796ccf8d-b179-440a-87a8-c6de61d08d4a-service-ca-bundle\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:50 crc kubenswrapper[4669]: 
I1001 11:30:50.990655 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-config\") pod \"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.990749 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.990804 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4glcr\" (UniqueName: \"kubernetes.io/projected/b5100377-ee4b-4427-9106-eea735423f5a-kube-api-access-4glcr\") pod \"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.990850 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzcj7\" (UniqueName: \"kubernetes.io/projected/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-kube-api-access-qzcj7\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.990887 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.991026 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.991188 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d2n8\" (UniqueName: \"kubernetes.io/projected/1467a745-44bf-40c6-a065-5008543d1363-kube-api-access-2d2n8\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.991231 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06fa5e25-562e-4bde-96d5-c0877aa235f7-metrics-tls\") pod \"dns-operator-744455d44c-6vbrp\" (UID: \"06fa5e25-562e-4bde-96d5-c0877aa235f7\") " pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.991289 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75mtt\" (UniqueName: \"kubernetes.io/projected/8e2e100b-8917-4730-83b1-2fc7716f740b-kube-api-access-75mtt\") pod \"openshift-config-operator-7777fb866f-2qkmt\" (UID: \"8e2e100b-8917-4730-83b1-2fc7716f740b\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:50 crc kubenswrapper[4669]: I1001 11:30:50.991340 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnb74\" (UniqueName: \"kubernetes.io/projected/06fa5e25-562e-4bde-96d5-c0877aa235f7-kube-api-access-dnb74\") pod \"dns-operator-744455d44c-6vbrp\" (UID: \"06fa5e25-562e-4bde-96d5-c0877aa235f7\") " pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" Oct 01 11:30:50 crc kubenswrapper[4669]: E1001 11:30:50.992507 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:51.492478527 +0000 UTC m=+142.592043544 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.092992 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093225 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d2n8\" (UniqueName: \"kubernetes.io/projected/1467a745-44bf-40c6-a065-5008543d1363-kube-api-access-2d2n8\") pod 
\"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093253 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06fa5e25-562e-4bde-96d5-c0877aa235f7-metrics-tls\") pod \"dns-operator-744455d44c-6vbrp\" (UID: \"06fa5e25-562e-4bde-96d5-c0877aa235f7\") " pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093276 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ef90082-710c-48db-81f4-535db9195c2f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6nqdz\" (UID: \"4ef90082-710c-48db-81f4-535db9195c2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093298 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3827dd5-b842-4000-8b2e-37f7cc411542-proxy-tls\") pod \"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093313 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fcb235d-657d-4be7-bbb3-afee58c08df9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v5z8k\" (UID: \"4fcb235d-657d-4be7-bbb3-afee58c08df9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093331 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fcb235d-657d-4be7-bbb3-afee58c08df9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v5z8k\" (UID: \"4fcb235d-657d-4be7-bbb3-afee58c08df9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093348 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnb74\" (UniqueName: \"kubernetes.io/projected/06fa5e25-562e-4bde-96d5-c0877aa235f7-kube-api-access-dnb74\") pod \"dns-operator-744455d44c-6vbrp\" (UID: \"06fa5e25-562e-4bde-96d5-c0877aa235f7\") " pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093364 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f9513dc-5114-4ec2-81ec-d86a31c3635b-etcd-service-ca\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093381 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167650ce-b43a-4e35-93c1-a802838246dd-config\") pod \"kube-apiserver-operator-766d6c64bb-4vhvv\" (UID: \"167650ce-b43a-4e35-93c1-a802838246dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093406 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ed84780-32e8-41fe-a20d-4c7a633ee541-config-volume\") pod \"collect-profiles-29321970-cpbkk\" (UID: 
\"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093426 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093441 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ed84780-32e8-41fe-a20d-4c7a633ee541-secret-volume\") pod \"collect-profiles-29321970-cpbkk\" (UID: \"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093460 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gddk\" (UniqueName: \"kubernetes.io/projected/4fcb235d-657d-4be7-bbb3-afee58c08df9-kube-api-access-5gddk\") pod \"kube-storage-version-migrator-operator-b67b599dd-v5z8k\" (UID: \"4fcb235d-657d-4be7-bbb3-afee58c08df9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093477 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/83f83da4-e855-4070-b524-4b7b789d0215-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jfjm9\" (UID: \"83f83da4-e855-4070-b524-4b7b789d0215\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093496 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k76q\" (UniqueName: \"kubernetes.io/projected/4dabd582-ba32-4518-920b-4cf38903dffc-kube-api-access-2k76q\") pod \"machine-config-server-x52td\" (UID: \"4dabd582-ba32-4518-920b-4cf38903dffc\") " pod="openshift-machine-config-operator/machine-config-server-x52td" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093516 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4f9513dc-5114-4ec2-81ec-d86a31c3635b-etcd-ca\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093530 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/167650ce-b43a-4e35-93c1-a802838246dd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vhvv\" (UID: \"167650ce-b43a-4e35-93c1-a802838246dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093549 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgnw\" (UniqueName: \"kubernetes.io/projected/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-kube-api-access-lmgnw\") pod \"marketplace-operator-79b997595-dcmjv\" (UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093564 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vjqsn\" (UniqueName: \"kubernetes.io/projected/0ed84780-32e8-41fe-a20d-4c7a633ee541-kube-api-access-vjqsn\") pod \"collect-profiles-29321970-cpbkk\" (UID: \"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093581 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1b219d54-0074-4283-963c-9f53c7b270fd-srv-cert\") pod \"catalog-operator-68c6474976-czlnv\" (UID: \"1b219d54-0074-4283-963c-9f53c7b270fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093597 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-mountpoint-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093649 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-registration-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093668 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-oauth-serving-cert\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 
11:30:51.093684 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkxkz\" (UniqueName: \"kubernetes.io/projected/4f48de82-89d5-4ef8-b5fe-71ef81240421-kube-api-access-lkxkz\") pod \"package-server-manager-789f6589d5-h4bpt\" (UID: \"4f48de82-89d5-4ef8-b5fe-71ef81240421\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093703 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3827dd5-b842-4000-8b2e-37f7cc411542-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093721 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/167650ce-b43a-4e35-93c1-a802838246dd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vhvv\" (UID: \"167650ce-b43a-4e35-93c1-a802838246dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093741 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r8qq\" (UniqueName: \"kubernetes.io/projected/78380b45-27e9-43cf-8e16-c8c63e0c217f-kube-api-access-7r8qq\") pod \"machine-config-controller-84d6567774-hhwl4\" (UID: \"78380b45-27e9-43cf-8e16-c8c63e0c217f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093759 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/1bf82d78-0b71-43b4-b6d3-babe39dd328e-machine-approver-tls\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.093780 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.094707 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8e2e100b-8917-4730-83b1-2fc7716f740b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2qkmt\" (UID: \"8e2e100b-8917-4730-83b1-2fc7716f740b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.094797 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:51.594757226 +0000 UTC m=+142.694322193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.094870 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-825pq\" (UniqueName: \"kubernetes.io/projected/d3392045-54bc-4b2a-a1f8-b7ac9f0d145b-kube-api-access-825pq\") pod \"openshift-controller-manager-operator-756b6f6bc6-zp57s\" (UID: \"d3392045-54bc-4b2a-a1f8-b7ac9f0d145b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095005 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv2j4\" (UniqueName: \"kubernetes.io/projected/fa1b934f-dde2-493a-be9e-962e002e3075-kube-api-access-hv2j4\") pod \"service-ca-operator-777779d784-dqh7b\" (UID: \"fa1b934f-dde2-493a-be9e-962e002e3075\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095032 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81a86689-3fbe-4668-9fe9-19113485da2f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-f48nt\" (UID: \"81a86689-3fbe-4668-9fe9-19113485da2f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095066 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d223984b-062e-423a-bfb9-f28dc4dd215b-serving-cert\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095103 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00e38d1c-41cd-437a-a6e9-3a53fe903c11-trusted-ca\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: \"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095129 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81a86689-3fbe-4668-9fe9-19113485da2f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-f48nt\" (UID: \"81a86689-3fbe-4668-9fe9-19113485da2f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095152 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cfd95c71-623e-4ee4-aadf-752a8e07d362-default-certificate\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095175 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-certificates\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc 
kubenswrapper[4669]: I1001 11:30:51.095194 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/796ccf8d-b179-440a-87a8-c6de61d08d4a-serving-cert\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095213 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22vqk\" (UniqueName: \"kubernetes.io/projected/4ef90082-710c-48db-81f4-535db9195c2f-kube-api-access-22vqk\") pod \"olm-operator-6b444d44fb-6nqdz\" (UID: \"4ef90082-710c-48db-81f4-535db9195c2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095230 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c2afea9b-5446-4746-86f9-db70b4916992-tmpfs\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: \"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095261 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d6022c9-7f75-48fa-98b8-a15e286c85b0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5d22b\" (UID: \"5d6022c9-7f75-48fa-98b8-a15e286c85b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095280 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cfd95c71-623e-4ee4-aadf-752a8e07d362-service-ca-bundle\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095301 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5100377-ee4b-4427-9106-eea735423f5a-serving-cert\") pod \"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095319 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d223984b-062e-423a-bfb9-f28dc4dd215b-trusted-ca\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095339 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcz8l\" (UniqueName: \"kubernetes.io/projected/796ccf8d-b179-440a-87a8-c6de61d08d4a-kube-api-access-rcz8l\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095357 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78380b45-27e9-43cf-8e16-c8c63e0c217f-proxy-tls\") pod \"machine-config-controller-84d6567774-hhwl4\" (UID: \"78380b45-27e9-43cf-8e16-c8c63e0c217f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" Oct 01 11:30:51 crc 
kubenswrapper[4669]: I1001 11:30:51.095382 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095409 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9513dc-5114-4ec2-81ec-d86a31c3635b-config\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095431 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lp27\" (UniqueName: \"kubernetes.io/projected/1d0ee8e1-4e70-40fe-8780-567c7b49825b-kube-api-access-6lp27\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095450 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" (UID: \"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095476 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-trusted-ca-bundle\") 
pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095498 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095532 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095556 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-trusted-ca-bundle\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095644 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-csi-data-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095677 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfd95c71-623e-4ee4-aadf-752a8e07d362-metrics-certs\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095723 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1b934f-dde2-493a-be9e-962e002e3075-serving-cert\") pod \"service-ca-operator-777779d784-dqh7b\" (UID: \"fa1b934f-dde2-493a-be9e-962e002e3075\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095783 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-service-ca\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095809 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96csf\" (UniqueName: \"kubernetes.io/projected/bc9c2ee1-684f-462b-be84-3cae1de6a0da-kube-api-access-96csf\") pod \"ingress-canary-q7cnf\" (UID: \"bc9c2ee1-684f-462b-be84-3cae1de6a0da\") " pod="openshift-ingress-canary/ingress-canary-q7cnf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095831 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1b219d54-0074-4283-963c-9f53c7b270fd-profile-collector-cert\") pod \"catalog-operator-68c6474976-czlnv\" (UID: \"1b219d54-0074-4283-963c-9f53c7b270fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:51 crc 
kubenswrapper[4669]: I1001 11:30:51.095884 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4dabd582-ba32-4518-920b-4cf38903dffc-node-bootstrap-token\") pod \"machine-config-server-x52td\" (UID: \"4dabd582-ba32-4518-920b-4cf38903dffc\") " pod="openshift-machine-config-operator/machine-config-server-x52td" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095910 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65cpm\" (UniqueName: \"kubernetes.io/projected/69d480e4-07d0-4081-a7b1-9c8568ee449a-kube-api-access-65cpm\") pod \"migrator-59844c95c7-k7fz4\" (UID: \"69d480e4-07d0-4081-a7b1-9c8568ee449a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.096042 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1b934f-dde2-493a-be9e-962e002e3075-config\") pod \"service-ca-operator-777779d784-dqh7b\" (UID: \"fa1b934f-dde2-493a-be9e-962e002e3075\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.095318 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8e2e100b-8917-4730-83b1-2fc7716f740b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2qkmt\" (UID: \"8e2e100b-8917-4730-83b1-2fc7716f740b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.097150 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-trusted-ca-bundle\") pod 
\"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.097810 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9513dc-5114-4ec2-81ec-d86a31c3635b-config\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.097869 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4dabd582-ba32-4518-920b-4cf38903dffc-certs\") pod \"machine-config-server-x52td\" (UID: \"4dabd582-ba32-4518-920b-4cf38903dffc\") " pod="openshift-machine-config-operator/machine-config-server-x52td" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.097899 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d6022c9-7f75-48fa-98b8-a15e286c85b0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5d22b\" (UID: \"5d6022c9-7f75-48fa-98b8-a15e286c85b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.098400 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/955a43ae-0efe-491a-8b49-65c5d0251203-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vwnc7\" (UID: \"955a43ae-0efe-491a-8b49-65c5d0251203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.098469 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d3392045-54bc-4b2a-a1f8-b7ac9f0d145b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zp57s\" (UID: \"d3392045-54bc-4b2a-a1f8-b7ac9f0d145b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.098493 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-trusted-ca\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.098617 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f9513dc-5114-4ec2-81ec-d86a31c3635b-etcd-service-ca\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.098851 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00e38d1c-41cd-437a-a6e9-3a53fe903c11-trusted-ca\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: \"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.100130 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-service-ca\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.100463 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.100572 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-certificates\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.100671 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-oauth-serving-cert\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101271 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-console-config\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101378 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc9c2ee1-684f-462b-be84-3cae1de6a0da-cert\") pod \"ingress-canary-q7cnf\" (UID: \"bc9c2ee1-684f-462b-be84-3cae1de6a0da\") " pod="openshift-ingress-canary/ingress-canary-q7cnf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101443 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3392045-54bc-4b2a-a1f8-b7ac9f0d145b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zp57s\" (UID: \"d3392045-54bc-4b2a-a1f8-b7ac9f0d145b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101452 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101469 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101510 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796ccf8d-b179-440a-87a8-c6de61d08d4a-service-ca-bundle\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101566 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f3827dd5-b842-4000-8b2e-37f7cc411542-images\") pod 
\"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101656 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjztf\" (UniqueName: \"kubernetes.io/projected/f3827dd5-b842-4000-8b2e-37f7cc411542-kube-api-access-qjztf\") pod \"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101713 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/78380b45-27e9-43cf-8e16-c8c63e0c217f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hhwl4\" (UID: \"78380b45-27e9-43cf-8e16-c8c63e0c217f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101786 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101794 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d223984b-062e-423a-bfb9-f28dc4dd215b-trusted-ca\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:51 crc 
kubenswrapper[4669]: I1001 11:30:51.101839 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg8x6\" (UniqueName: \"kubernetes.io/projected/1b219d54-0074-4283-963c-9f53c7b270fd-kube-api-access-mg8x6\") pod \"catalog-operator-68c6474976-czlnv\" (UID: \"1b219d54-0074-4283-963c-9f53c7b270fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101890 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8rtg\" (UniqueName: \"kubernetes.io/projected/dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2-kube-api-access-l8rtg\") pod \"dns-default-5bjch\" (UID: \"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2\") " pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101925 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-trusted-ca\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101921 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-console-config\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.101943 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f41d7856-7012-4ade-bc8c-8354d6537e9d-signing-cabundle\") pod \"service-ca-9c57cc56f-44vxs\" (UID: \"f41d7856-7012-4ade-bc8c-8354d6537e9d\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102048 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102118 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dcmjv\" (UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102166 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75mtt\" (UniqueName: \"kubernetes.io/projected/8e2e100b-8917-4730-83b1-2fc7716f740b-kube-api-access-75mtt\") pod \"openshift-config-operator-7777fb866f-2qkmt\" (UID: \"8e2e100b-8917-4730-83b1-2fc7716f740b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102196 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ef90082-710c-48db-81f4-535db9195c2f-srv-cert\") pod \"olm-operator-6b444d44fb-6nqdz\" (UID: \"4ef90082-710c-48db-81f4-535db9195c2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102248 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-tls\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102275 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1bf82d78-0b71-43b4-b6d3-babe39dd328e-auth-proxy-config\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102307 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0debf10a-a4dd-43d7-84fd-3456a2ad1b59-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p62sv\" (UID: \"0debf10a-a4dd-43d7-84fd-3456a2ad1b59\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.102326 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:51.602309882 +0000 UTC m=+142.701874859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102354 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsd22\" (UniqueName: \"kubernetes.io/projected/6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e-kube-api-access-jsd22\") pod \"multus-admission-controller-857f4d67dd-zxlg8\" (UID: \"6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102383 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-plugins-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102416 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88mgg\" (UniqueName: \"kubernetes.io/projected/0debf10a-a4dd-43d7-84fd-3456a2ad1b59-kube-api-access-88mgg\") pod \"cluster-samples-operator-665b6dd947-p62sv\" (UID: \"0debf10a-a4dd-43d7-84fd-3456a2ad1b59\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102442 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/c2afea9b-5446-4746-86f9-db70b4916992-webhook-cert\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: \"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102467 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7lq\" (UniqueName: \"kubernetes.io/projected/81a86689-3fbe-4668-9fe9-19113485da2f-kube-api-access-vl7lq\") pod \"openshift-apiserver-operator-796bbdcf4f-f48nt\" (UID: \"81a86689-3fbe-4668-9fe9-19113485da2f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102495 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpmjs\" (UniqueName: \"kubernetes.io/projected/4f9513dc-5114-4ec2-81ec-d86a31c3635b-kube-api-access-bpmjs\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102519 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" (UID: \"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102549 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102595 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq8sz\" (UniqueName: \"kubernetes.io/projected/54979db4-1c85-4bfd-aec1-c154590ec33b-kube-api-access-bq8sz\") pod \"downloads-7954f5f757-hdrzq\" (UID: \"54979db4-1c85-4bfd-aec1-c154590ec33b\") " pod="openshift-console/downloads-7954f5f757-hdrzq" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102616 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-serving-cert\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102637 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v58c2\" (UniqueName: \"kubernetes.io/projected/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-kube-api-access-v58c2\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" (UID: \"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102668 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2e100b-8917-4730-83b1-2fc7716f740b-serving-cert\") pod \"openshift-config-operator-7777fb866f-2qkmt\" (UID: \"8e2e100b-8917-4730-83b1-2fc7716f740b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102687 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/c2afea9b-5446-4746-86f9-db70b4916992-apiservice-cert\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: \"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102710 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qgn\" (UniqueName: \"kubernetes.io/projected/c2afea9b-5446-4746-86f9-db70b4916992-kube-api-access-s9qgn\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: \"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102731 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zxlg8\" (UID: \"6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102748 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/955a43ae-0efe-491a-8b49-65c5d0251203-config\") pod \"kube-controller-manager-operator-78b949d7b-vwnc7\" (UID: \"955a43ae-0efe-491a-8b49-65c5d0251203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102769 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102794 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f9513dc-5114-4ec2-81ec-d86a31c3635b-etcd-client\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102813 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-oauth-config\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102834 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cfd95c71-623e-4ee4-aadf-752a8e07d362-stats-auth\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102855 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102876 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00e38d1c-41cd-437a-a6e9-3a53fe903c11-bound-sa-token\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: 
\"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102897 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdtnn\" (UniqueName: \"kubernetes.io/projected/f41d7856-7012-4ade-bc8c-8354d6537e9d-kube-api-access-sdtnn\") pod \"service-ca-9c57cc56f-44vxs\" (UID: \"f41d7856-7012-4ade-bc8c-8354d6537e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102922 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t297l\" (UniqueName: \"kubernetes.io/projected/cfd95c71-623e-4ee4-aadf-752a8e07d362-kube-api-access-t297l\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102950 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00e38d1c-41cd-437a-a6e9-3a53fe903c11-metrics-tls\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: \"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102977 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrxj8\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-kube-api-access-rrxj8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103005 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d223984b-062e-423a-bfb9-f28dc4dd215b-config\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103128 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" (UID: \"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103198 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8958\" (UniqueName: \"kubernetes.io/projected/d223984b-062e-423a-bfb9-f28dc4dd215b-kube-api-access-x8958\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103259 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3392045-54bc-4b2a-a1f8-b7ac9f0d145b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zp57s\" (UID: \"d3392045-54bc-4b2a-a1f8-b7ac9f0d145b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103364 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1bf82d78-0b71-43b4-b6d3-babe39dd328e-auth-proxy-config\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:51 crc 
kubenswrapper[4669]: I1001 11:30:51.103388 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-policies\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103440 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796ccf8d-b179-440a-87a8-c6de61d08d4a-config\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103489 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl65k\" (UniqueName: \"kubernetes.io/projected/1bf82d78-0b71-43b4-b6d3-babe39dd328e-kube-api-access-xl65k\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103539 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d6022c9-7f75-48fa-98b8-a15e286c85b0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5d22b\" (UID: \"5d6022c9-7f75-48fa-98b8-a15e286c85b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103597 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-dir\") pod 
\"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103651 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbgg6\" (UniqueName: \"kubernetes.io/projected/00e38d1c-41cd-437a-a6e9-3a53fe903c11-kube-api-access-nbgg6\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: \"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103702 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2-metrics-tls\") pod \"dns-default-5bjch\" (UID: \"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2\") " pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103753 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796ccf8d-b179-440a-87a8-c6de61d08d4a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103803 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2-config-volume\") pod \"dns-default-5bjch\" (UID: \"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2\") " pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103867 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103929 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9513dc-5114-4ec2-81ec-d86a31c3635b-serving-cert\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103958 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d223984b-062e-423a-bfb9-f28dc4dd215b-config\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.103983 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-client-ca\") pod \"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.104040 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf82d78-0b71-43b4-b6d3-babe39dd328e-config\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.104181 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-bound-sa-token\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.104281 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-config\") pod \"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.104338 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqmqz\" (UniqueName: \"kubernetes.io/projected/83f83da4-e855-4070-b524-4b7b789d0215-kube-api-access-vqmqz\") pod \"control-plane-machine-set-operator-78cbb6b69f-jfjm9\" (UID: \"83f83da4-e855-4070-b524-4b7b789d0215\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.104401 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dcmjv\" (UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.104449 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f41d7856-7012-4ade-bc8c-8354d6537e9d-signing-key\") pod \"service-ca-9c57cc56f-44vxs\" (UID: 
\"f41d7856-7012-4ade-bc8c-8354d6537e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.104504 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.104512 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.105312 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4glcr\" (UniqueName: \"kubernetes.io/projected/b5100377-ee4b-4427-9106-eea735423f5a-kube-api-access-4glcr\") pod \"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.105382 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzcj7\" (UniqueName: \"kubernetes.io/projected/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-kube-api-access-qzcj7\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.105479 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f48de82-89d5-4ef8-b5fe-71ef81240421-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h4bpt\" (UID: \"4f48de82-89d5-4ef8-b5fe-71ef81240421\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.105544 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-socket-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.105611 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/955a43ae-0efe-491a-8b49-65c5d0251203-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vwnc7\" (UID: \"955a43ae-0efe-491a-8b49-65c5d0251203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.102968 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796ccf8d-b179-440a-87a8-c6de61d08d4a-service-ca-bundle\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.106967 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: 
\"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.107422 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-dir\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.108547 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.109143 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf82d78-0b71-43b4-b6d3-babe39dd328e-config\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.111069 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.111967 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" (UID: \"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.114532 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f9513dc-5114-4ec2-81ec-d86a31c3635b-etcd-client\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.115169 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-client-ca\") pod \"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.115910 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9513dc-5114-4ec2-81ec-d86a31c3635b-serving-cert\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.115921 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3392045-54bc-4b2a-a1f8-b7ac9f0d145b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zp57s\" (UID: \"d3392045-54bc-4b2a-a1f8-b7ac9f0d145b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.115987 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-config\") pod \"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.116170 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4f9513dc-5114-4ec2-81ec-d86a31c3635b-etcd-ca\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.117656 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d223984b-062e-423a-bfb9-f28dc4dd215b-serving-cert\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.117966 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/796ccf8d-b179-440a-87a8-c6de61d08d4a-serving-cert\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.118851 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796ccf8d-b179-440a-87a8-c6de61d08d4a-config\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:51 crc 
kubenswrapper[4669]: I1001 11:30:51.119753 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0debf10a-a4dd-43d7-84fd-3456a2ad1b59-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p62sv\" (UID: \"0debf10a-a4dd-43d7-84fd-3456a2ad1b59\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.119226 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.119237 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00e38d1c-41cd-437a-a6e9-3a53fe903c11-metrics-tls\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: \"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.119713 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1bf82d78-0b71-43b4-b6d3-babe39dd328e-machine-approver-tls\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.119748 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-policies\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: 
\"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.119470 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.120252 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06fa5e25-562e-4bde-96d5-c0877aa235f7-metrics-tls\") pod \"dns-operator-744455d44c-6vbrp\" (UID: \"06fa5e25-562e-4bde-96d5-c0877aa235f7\") " pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.120356 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.121146 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5100377-ee4b-4427-9106-eea735423f5a-serving-cert\") pod \"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.121387 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" (UID: \"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.122775 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.122810 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2e100b-8917-4730-83b1-2fc7716f740b-serving-cert\") pod \"openshift-config-operator-7777fb866f-2qkmt\" (UID: \"8e2e100b-8917-4730-83b1-2fc7716f740b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.118236 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796ccf8d-b179-440a-87a8-c6de61d08d4a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.123268 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-serving-cert\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.124853 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.125251 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.128094 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-oauth-config\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.132057 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-825pq\" (UniqueName: \"kubernetes.io/projected/d3392045-54bc-4b2a-a1f8-b7ac9f0d145b-kube-api-access-825pq\") pod \"openshift-controller-manager-operator-756b6f6bc6-zp57s\" (UID: \"d3392045-54bc-4b2a-a1f8-b7ac9f0d145b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.134036 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-tls\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.144915 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.157553 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnb74\" (UniqueName: \"kubernetes.io/projected/06fa5e25-562e-4bde-96d5-c0877aa235f7-kube-api-access-dnb74\") pod \"dns-operator-744455d44c-6vbrp\" (UID: \"06fa5e25-562e-4bde-96d5-c0877aa235f7\") " pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.172604 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcz8l\" (UniqueName: \"kubernetes.io/projected/796ccf8d-b179-440a-87a8-c6de61d08d4a-kube-api-access-rcz8l\") pod \"authentication-operator-69f744f599-wn6cw\" (UID: \"796ccf8d-b179-440a-87a8-c6de61d08d4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.197057 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d2n8\" (UniqueName: \"kubernetes.io/projected/1467a745-44bf-40c6-a065-5008543d1363-kube-api-access-2d2n8\") pod \"console-f9d7485db-cclkd\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") " pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.206654 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.206841 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:51.706803696 +0000 UTC m=+142.806368673 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.206972 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lp27\" (UniqueName: \"kubernetes.io/projected/1d0ee8e1-4e70-40fe-8780-567c7b49825b-kube-api-access-6lp27\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207055 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-csi-data-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207116 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/cfd95c71-623e-4ee4-aadf-752a8e07d362-metrics-certs\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207143 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1b934f-dde2-493a-be9e-962e002e3075-serving-cert\") pod \"service-ca-operator-777779d784-dqh7b\" (UID: \"fa1b934f-dde2-493a-be9e-962e002e3075\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207169 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96csf\" (UniqueName: \"kubernetes.io/projected/bc9c2ee1-684f-462b-be84-3cae1de6a0da-kube-api-access-96csf\") pod \"ingress-canary-q7cnf\" (UID: \"bc9c2ee1-684f-462b-be84-3cae1de6a0da\") " pod="openshift-ingress-canary/ingress-canary-q7cnf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207195 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1b219d54-0074-4283-963c-9f53c7b270fd-profile-collector-cert\") pod \"catalog-operator-68c6474976-czlnv\" (UID: \"1b219d54-0074-4283-963c-9f53c7b270fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207219 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4dabd582-ba32-4518-920b-4cf38903dffc-node-bootstrap-token\") pod \"machine-config-server-x52td\" (UID: \"4dabd582-ba32-4518-920b-4cf38903dffc\") " pod="openshift-machine-config-operator/machine-config-server-x52td" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207245 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1b934f-dde2-493a-be9e-962e002e3075-config\") pod \"service-ca-operator-777779d784-dqh7b\" (UID: \"fa1b934f-dde2-493a-be9e-962e002e3075\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207270 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4dabd582-ba32-4518-920b-4cf38903dffc-certs\") pod \"machine-config-server-x52td\" (UID: \"4dabd582-ba32-4518-920b-4cf38903dffc\") " pod="openshift-machine-config-operator/machine-config-server-x52td" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207296 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d6022c9-7f75-48fa-98b8-a15e286c85b0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5d22b\" (UID: \"5d6022c9-7f75-48fa-98b8-a15e286c85b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207319 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/955a43ae-0efe-491a-8b49-65c5d0251203-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vwnc7\" (UID: \"955a43ae-0efe-491a-8b49-65c5d0251203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207356 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65cpm\" (UniqueName: \"kubernetes.io/projected/69d480e4-07d0-4081-a7b1-9c8568ee449a-kube-api-access-65cpm\") pod \"migrator-59844c95c7-k7fz4\" (UID: \"69d480e4-07d0-4081-a7b1-9c8568ee449a\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207357 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-csi-data-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207379 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc9c2ee1-684f-462b-be84-3cae1de6a0da-cert\") pod \"ingress-canary-q7cnf\" (UID: \"bc9c2ee1-684f-462b-be84-3cae1de6a0da\") " pod="openshift-ingress-canary/ingress-canary-q7cnf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207531 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f3827dd5-b842-4000-8b2e-37f7cc411542-images\") pod \"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207569 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjztf\" (UniqueName: \"kubernetes.io/projected/f3827dd5-b842-4000-8b2e-37f7cc411542-kube-api-access-qjztf\") pod \"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207603 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/78380b45-27e9-43cf-8e16-c8c63e0c217f-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-hhwl4\" (UID: \"78380b45-27e9-43cf-8e16-c8c63e0c217f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207636 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg8x6\" (UniqueName: \"kubernetes.io/projected/1b219d54-0074-4283-963c-9f53c7b270fd-kube-api-access-mg8x6\") pod \"catalog-operator-68c6474976-czlnv\" (UID: \"1b219d54-0074-4283-963c-9f53c7b270fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207666 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8rtg\" (UniqueName: \"kubernetes.io/projected/dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2-kube-api-access-l8rtg\") pod \"dns-default-5bjch\" (UID: \"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2\") " pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207719 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207757 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f41d7856-7012-4ade-bc8c-8354d6537e9d-signing-cabundle\") pod \"service-ca-9c57cc56f-44vxs\" (UID: \"f41d7856-7012-4ade-bc8c-8354d6537e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207795 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dcmjv\" (UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207841 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ef90082-710c-48db-81f4-535db9195c2f-srv-cert\") pod \"olm-operator-6b444d44fb-6nqdz\" (UID: \"4ef90082-710c-48db-81f4-535db9195c2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207875 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsd22\" (UniqueName: \"kubernetes.io/projected/6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e-kube-api-access-jsd22\") pod \"multus-admission-controller-857f4d67dd-zxlg8\" (UID: \"6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207898 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-plugins-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.207937 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2afea9b-5446-4746-86f9-db70b4916992-webhook-cert\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: \"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 
11:30:51.207974 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7lq\" (UniqueName: \"kubernetes.io/projected/81a86689-3fbe-4668-9fe9-19113485da2f-kube-api-access-vl7lq\") pod \"openshift-apiserver-operator-796bbdcf4f-f48nt\" (UID: \"81a86689-3fbe-4668-9fe9-19113485da2f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.208101 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2afea9b-5446-4746-86f9-db70b4916992-apiservice-cert\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: \"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.208133 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qgn\" (UniqueName: \"kubernetes.io/projected/c2afea9b-5446-4746-86f9-db70b4916992-kube-api-access-s9qgn\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: \"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.208176 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zxlg8\" (UID: \"6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.208207 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/955a43ae-0efe-491a-8b49-65c5d0251203-config\") pod \"kube-controller-manager-operator-78b949d7b-vwnc7\" (UID: 
\"955a43ae-0efe-491a-8b49-65c5d0251203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.208236 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cfd95c71-623e-4ee4-aadf-752a8e07d362-stats-auth\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.208284 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdtnn\" (UniqueName: \"kubernetes.io/projected/f41d7856-7012-4ade-bc8c-8354d6537e9d-kube-api-access-sdtnn\") pod \"service-ca-9c57cc56f-44vxs\" (UID: \"f41d7856-7012-4ade-bc8c-8354d6537e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.208337 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t297l\" (UniqueName: \"kubernetes.io/projected/cfd95c71-623e-4ee4-aadf-752a8e07d362-kube-api-access-t297l\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.208426 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d6022c9-7f75-48fa-98b8-a15e286c85b0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5d22b\" (UID: \"5d6022c9-7f75-48fa-98b8-a15e286c85b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.208467 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2-config-volume\") pod \"dns-default-5bjch\" (UID: \"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2\") " pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.208491 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2-metrics-tls\") pod \"dns-default-5bjch\" (UID: \"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2\") " pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209257 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqmqz\" (UniqueName: \"kubernetes.io/projected/83f83da4-e855-4070-b524-4b7b789d0215-kube-api-access-vqmqz\") pod \"control-plane-machine-set-operator-78cbb6b69f-jfjm9\" (UID: \"83f83da4-e855-4070-b524-4b7b789d0215\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209309 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dcmjv\" (UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209336 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f41d7856-7012-4ade-bc8c-8354d6537e9d-signing-key\") pod \"service-ca-9c57cc56f-44vxs\" (UID: \"f41d7856-7012-4ade-bc8c-8354d6537e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209399 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-socket-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209428 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/955a43ae-0efe-491a-8b49-65c5d0251203-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vwnc7\" (UID: \"955a43ae-0efe-491a-8b49-65c5d0251203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209454 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f48de82-89d5-4ef8-b5fe-71ef81240421-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h4bpt\" (UID: \"4f48de82-89d5-4ef8-b5fe-71ef81240421\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209481 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ef90082-710c-48db-81f4-535db9195c2f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6nqdz\" (UID: \"4ef90082-710c-48db-81f4-535db9195c2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209512 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fcb235d-657d-4be7-bbb3-afee58c08df9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v5z8k\" (UID: \"4fcb235d-657d-4be7-bbb3-afee58c08df9\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209536 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fcb235d-657d-4be7-bbb3-afee58c08df9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v5z8k\" (UID: \"4fcb235d-657d-4be7-bbb3-afee58c08df9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209565 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3827dd5-b842-4000-8b2e-37f7cc411542-proxy-tls\") pod \"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209591 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167650ce-b43a-4e35-93c1-a802838246dd-config\") pod \"kube-apiserver-operator-766d6c64bb-4vhvv\" (UID: \"167650ce-b43a-4e35-93c1-a802838246dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209616 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ed84780-32e8-41fe-a20d-4c7a633ee541-config-volume\") pod \"collect-profiles-29321970-cpbkk\" (UID: \"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209647 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ed84780-32e8-41fe-a20d-4c7a633ee541-secret-volume\") pod \"collect-profiles-29321970-cpbkk\" (UID: \"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209671 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gddk\" (UniqueName: \"kubernetes.io/projected/4fcb235d-657d-4be7-bbb3-afee58c08df9-kube-api-access-5gddk\") pod \"kube-storage-version-migrator-operator-b67b599dd-v5z8k\" (UID: \"4fcb235d-657d-4be7-bbb3-afee58c08df9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209700 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/83f83da4-e855-4070-b524-4b7b789d0215-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jfjm9\" (UID: \"83f83da4-e855-4070-b524-4b7b789d0215\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209728 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k76q\" (UniqueName: \"kubernetes.io/projected/4dabd582-ba32-4518-920b-4cf38903dffc-kube-api-access-2k76q\") pod \"machine-config-server-x52td\" (UID: \"4dabd582-ba32-4518-920b-4cf38903dffc\") " pod="openshift-machine-config-operator/machine-config-server-x52td" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209756 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgnw\" (UniqueName: \"kubernetes.io/projected/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-kube-api-access-lmgnw\") pod \"marketplace-operator-79b997595-dcmjv\" 
(UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209784 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjqsn\" (UniqueName: \"kubernetes.io/projected/0ed84780-32e8-41fe-a20d-4c7a633ee541-kube-api-access-vjqsn\") pod \"collect-profiles-29321970-cpbkk\" (UID: \"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209808 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/167650ce-b43a-4e35-93c1-a802838246dd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vhvv\" (UID: \"167650ce-b43a-4e35-93c1-a802838246dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209832 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-mountpoint-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209855 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-registration-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.209942 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/1b219d54-0074-4283-963c-9f53c7b270fd-srv-cert\") pod \"catalog-operator-68c6474976-czlnv\" (UID: \"1b219d54-0074-4283-963c-9f53c7b270fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.210244 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3827dd5-b842-4000-8b2e-37f7cc411542-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.210245 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1b934f-dde2-493a-be9e-962e002e3075-config\") pod \"service-ca-operator-777779d784-dqh7b\" (UID: \"fa1b934f-dde2-493a-be9e-962e002e3075\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.210279 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/167650ce-b43a-4e35-93c1-a802838246dd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vhvv\" (UID: \"167650ce-b43a-4e35-93c1-a802838246dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.210342 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r8qq\" (UniqueName: \"kubernetes.io/projected/78380b45-27e9-43cf-8e16-c8c63e0c217f-kube-api-access-7r8qq\") pod \"machine-config-controller-84d6567774-hhwl4\" (UID: \"78380b45-27e9-43cf-8e16-c8c63e0c217f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" Oct 01 11:30:51 crc 
kubenswrapper[4669]: I1001 11:30:51.211228 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkxkz\" (UniqueName: \"kubernetes.io/projected/4f48de82-89d5-4ef8-b5fe-71ef81240421-kube-api-access-lkxkz\") pod \"package-server-manager-789f6589d5-h4bpt\" (UID: \"4f48de82-89d5-4ef8-b5fe-71ef81240421\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.211290 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2j4\" (UniqueName: \"kubernetes.io/projected/fa1b934f-dde2-493a-be9e-962e002e3075-kube-api-access-hv2j4\") pod \"service-ca-operator-777779d784-dqh7b\" (UID: \"fa1b934f-dde2-493a-be9e-962e002e3075\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.211329 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81a86689-3fbe-4668-9fe9-19113485da2f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-f48nt\" (UID: \"81a86689-3fbe-4668-9fe9-19113485da2f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.211330 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d6022c9-7f75-48fa-98b8-a15e286c85b0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5d22b\" (UID: \"5d6022c9-7f75-48fa-98b8-a15e286c85b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.211364 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81a86689-3fbe-4668-9fe9-19113485da2f-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-f48nt\" (UID: \"81a86689-3fbe-4668-9fe9-19113485da2f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.211394 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cfd95c71-623e-4ee4-aadf-752a8e07d362-default-certificate\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.211431 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22vqk\" (UniqueName: \"kubernetes.io/projected/4ef90082-710c-48db-81f4-535db9195c2f-kube-api-access-22vqk\") pod \"olm-operator-6b444d44fb-6nqdz\" (UID: \"4ef90082-710c-48db-81f4-535db9195c2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.211458 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c2afea9b-5446-4746-86f9-db70b4916992-tmpfs\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: \"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.211484 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d6022c9-7f75-48fa-98b8-a15e286c85b0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5d22b\" (UID: \"5d6022c9-7f75-48fa-98b8-a15e286c85b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.211512 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfd95c71-623e-4ee4-aadf-752a8e07d362-service-ca-bundle\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.211551 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78380b45-27e9-43cf-8e16-c8c63e0c217f-proxy-tls\") pod \"machine-config-controller-84d6567774-hhwl4\" (UID: \"78380b45-27e9-43cf-8e16-c8c63e0c217f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.211625 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/955a43ae-0efe-491a-8b49-65c5d0251203-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vwnc7\" (UID: \"955a43ae-0efe-491a-8b49-65c5d0251203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.212410 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f3827dd5-b842-4000-8b2e-37f7cc411542-images\") pod \"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.212858 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/955a43ae-0efe-491a-8b49-65c5d0251203-config\") pod \"kube-controller-manager-operator-78b949d7b-vwnc7\" (UID: \"955a43ae-0efe-491a-8b49-65c5d0251203\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.213305 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1b934f-dde2-493a-be9e-962e002e3075-serving-cert\") pod \"service-ca-operator-777779d784-dqh7b\" (UID: \"fa1b934f-dde2-493a-be9e-962e002e3075\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.213722 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc9c2ee1-684f-462b-be84-3cae1de6a0da-cert\") pod \"ingress-canary-q7cnf\" (UID: \"bc9c2ee1-684f-462b-be84-3cae1de6a0da\") " pod="openshift-ingress-canary/ingress-canary-q7cnf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.213806 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f41d7856-7012-4ade-bc8c-8354d6537e9d-signing-cabundle\") pod \"service-ca-9c57cc56f-44vxs\" (UID: \"f41d7856-7012-4ade-bc8c-8354d6537e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.213883 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfd95c71-623e-4ee4-aadf-752a8e07d362-metrics-certs\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.214593 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-plugins-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " 
pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.215957 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/78380b45-27e9-43cf-8e16-c8c63e0c217f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hhwl4\" (UID: \"78380b45-27e9-43cf-8e16-c8c63e0c217f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.217204 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75mtt\" (UniqueName: \"kubernetes.io/projected/8e2e100b-8917-4730-83b1-2fc7716f740b-kube-api-access-75mtt\") pod \"openshift-config-operator-7777fb866f-2qkmt\" (UID: \"8e2e100b-8917-4730-83b1-2fc7716f740b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.217783 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dcmjv\" (UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.217999 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:51.717973652 +0000 UTC m=+142.817538639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.218267 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1b219d54-0074-4283-963c-9f53c7b270fd-profile-collector-cert\") pod \"catalog-operator-68c6474976-czlnv\" (UID: \"1b219d54-0074-4283-963c-9f53c7b270fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.218530 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cfd95c71-623e-4ee4-aadf-752a8e07d362-stats-auth\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.218905 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4dabd582-ba32-4518-920b-4cf38903dffc-node-bootstrap-token\") pod \"machine-config-server-x52td\" (UID: \"4dabd582-ba32-4518-920b-4cf38903dffc\") " pod="openshift-machine-config-operator/machine-config-server-x52td" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.218984 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4dabd582-ba32-4518-920b-4cf38903dffc-certs\") pod \"machine-config-server-x52td\" (UID: \"4dabd582-ba32-4518-920b-4cf38903dffc\") " 
pod="openshift-machine-config-operator/machine-config-server-x52td" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.219044 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78380b45-27e9-43cf-8e16-c8c63e0c217f-proxy-tls\") pod \"machine-config-controller-84d6567774-hhwl4\" (UID: \"78380b45-27e9-43cf-8e16-c8c63e0c217f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.219957 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dcmjv\" (UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.220390 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ef90082-710c-48db-81f4-535db9195c2f-srv-cert\") pod \"olm-operator-6b444d44fb-6nqdz\" (UID: \"4ef90082-710c-48db-81f4-535db9195c2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.220504 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-socket-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.220803 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c2afea9b-5446-4746-86f9-db70b4916992-tmpfs\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: 
\"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.221166 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2afea9b-5446-4746-86f9-db70b4916992-webhook-cert\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: \"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.221347 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d6022c9-7f75-48fa-98b8-a15e286c85b0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5d22b\" (UID: \"5d6022c9-7f75-48fa-98b8-a15e286c85b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.221545 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2-metrics-tls\") pod \"dns-default-5bjch\" (UID: \"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2\") " pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.221895 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfd95c71-623e-4ee4-aadf-752a8e07d362-service-ca-bundle\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.219970 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81a86689-3fbe-4668-9fe9-19113485da2f-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-f48nt\" (UID: \"81a86689-3fbe-4668-9fe9-19113485da2f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.222378 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2-config-volume\") pod \"dns-default-5bjch\" (UID: \"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2\") " pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.222534 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f41d7856-7012-4ade-bc8c-8354d6537e9d-signing-key\") pod \"service-ca-9c57cc56f-44vxs\" (UID: \"f41d7856-7012-4ade-bc8c-8354d6537e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.222947 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3827dd5-b842-4000-8b2e-37f7cc411542-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.223248 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-mountpoint-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.223284 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81a86689-3fbe-4668-9fe9-19113485da2f-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-f48nt\" (UID: \"81a86689-3fbe-4668-9fe9-19113485da2f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.223576 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ed84780-32e8-41fe-a20d-4c7a633ee541-config-volume\") pod \"collect-profiles-29321970-cpbkk\" (UID: \"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.224168 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1d0ee8e1-4e70-40fe-8780-567c7b49825b-registration-dir\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.224829 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zxlg8\" (UID: \"6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.225343 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cfd95c71-623e-4ee4-aadf-752a8e07d362-default-certificate\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.225345 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4fcb235d-657d-4be7-bbb3-afee58c08df9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v5z8k\" (UID: \"4fcb235d-657d-4be7-bbb3-afee58c08df9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.225991 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2afea9b-5446-4746-86f9-db70b4916992-apiservice-cert\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: \"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.226853 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ef90082-710c-48db-81f4-535db9195c2f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6nqdz\" (UID: \"4ef90082-710c-48db-81f4-535db9195c2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.227786 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3827dd5-b842-4000-8b2e-37f7cc411542-proxy-tls\") pod \"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.228809 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/83f83da4-e855-4070-b524-4b7b789d0215-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jfjm9\" (UID: \"83f83da4-e855-4070-b524-4b7b789d0215\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.229575 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1b219d54-0074-4283-963c-9f53c7b270fd-srv-cert\") pod \"catalog-operator-68c6474976-czlnv\" (UID: \"1b219d54-0074-4283-963c-9f53c7b270fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.231632 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f48de82-89d5-4ef8-b5fe-71ef81240421-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h4bpt\" (UID: \"4f48de82-89d5-4ef8-b5fe-71ef81240421\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.256593 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrxj8\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-kube-api-access-rrxj8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.271873 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mgg\" (UniqueName: \"kubernetes.io/projected/0debf10a-a4dd-43d7-84fd-3456a2ad1b59-kube-api-access-88mgg\") pod \"cluster-samples-operator-665b6dd947-p62sv\" (UID: \"0debf10a-a4dd-43d7-84fd-3456a2ad1b59\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.293759 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.296852 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpmjs\" (UniqueName: \"kubernetes.io/projected/4f9513dc-5114-4ec2-81ec-d86a31c3635b-kube-api-access-bpmjs\") pod \"etcd-operator-b45778765-x95jd\" (UID: \"4f9513dc-5114-4ec2-81ec-d86a31c3635b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.301184 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fcb235d-657d-4be7-bbb3-afee58c08df9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v5z8k\" (UID: \"4fcb235d-657d-4be7-bbb3-afee58c08df9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.301250 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167650ce-b43a-4e35-93c1-a802838246dd-config\") pod \"kube-apiserver-operator-766d6c64bb-4vhvv\" (UID: \"167650ce-b43a-4e35-93c1-a802838246dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.302800 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/167650ce-b43a-4e35-93c1-a802838246dd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vhvv\" (UID: \"167650ce-b43a-4e35-93c1-a802838246dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.304166 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0ed84780-32e8-41fe-a20d-4c7a633ee541-secret-volume\") pod \"collect-profiles-29321970-cpbkk\" (UID: \"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.304712 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00e38d1c-41cd-437a-a6e9-3a53fe903c11-bound-sa-token\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: \"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.312466 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.312930 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:51.81289066 +0000 UTC m=+142.912455637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.313448 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.314188 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:51.813999577 +0000 UTC m=+142.913564604 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.316043 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" (UID: \"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.330966 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq8sz\" (UniqueName: \"kubernetes.io/projected/54979db4-1c85-4bfd-aec1-c154590ec33b-kube-api-access-bq8sz\") pod \"downloads-7954f5f757-hdrzq\" (UID: \"54979db4-1c85-4bfd-aec1-c154590ec33b\") " pod="openshift-console/downloads-7954f5f757-hdrzq" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.358730 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl65k\" (UniqueName: \"kubernetes.io/projected/1bf82d78-0b71-43b4-b6d3-babe39dd328e-kube-api-access-xl65k\") pod \"machine-approver-56656f9798-9x9hq\" (UID: \"1bf82d78-0b71-43b4-b6d3-babe39dd328e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.376115 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4glcr\" (UniqueName: \"kubernetes.io/projected/b5100377-ee4b-4427-9106-eea735423f5a-kube-api-access-4glcr\") pod 
\"route-controller-manager-6576b87f9c-2r4jf\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.385766 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.392929 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.399948 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzcj7\" (UniqueName: \"kubernetes.io/projected/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-kube-api-access-qzcj7\") pod \"oauth-openshift-558db77b4-8hc7m\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.408297 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-hdrzq" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.414792 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.415443 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbgg6\" (UniqueName: \"kubernetes.io/projected/00e38d1c-41cd-437a-a6e9-3a53fe903c11-kube-api-access-nbgg6\") pod \"ingress-operator-5b745b69d9-56qlt\" (UID: \"00e38d1c-41cd-437a-a6e9-3a53fe903c11\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.415748 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.415887 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:51.915851496 +0000 UTC m=+143.015416493 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.433920 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.436241 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.438657 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v58c2\" (UniqueName: \"kubernetes.io/projected/ef2d43d6-2138-4c6f-9bd1-09a621ebda8c-kube-api-access-v58c2\") pod \"cluster-image-registry-operator-dc59b4c8b-k852r\" (UID: \"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.441378 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.453948 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-bound-sa-token\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.459375 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.465032 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.471523 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8958\" (UniqueName: \"kubernetes.io/projected/d223984b-062e-423a-bfb9-f28dc4dd215b-kube-api-access-x8958\") pod \"console-operator-58897d9998-qqhrp\" (UID: \"d223984b-062e-423a-bfb9-f28dc4dd215b\") " pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.472924 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.487777 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-cclkd"] Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.488728 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.515298 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lp27\" (UniqueName: \"kubernetes.io/projected/1d0ee8e1-4e70-40fe-8780-567c7b49825b-kube-api-access-6lp27\") pod \"csi-hostpathplugin-q8qw4\" (UID: \"1d0ee8e1-4e70-40fe-8780-567c7b49825b\") " pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.516470 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.517023 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.017007178 +0000 UTC m=+143.116572155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.539527 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96csf\" (UniqueName: \"kubernetes.io/projected/bc9c2ee1-684f-462b-be84-3cae1de6a0da-kube-api-access-96csf\") pod \"ingress-canary-q7cnf\" (UID: \"bc9c2ee1-684f-462b-be84-3cae1de6a0da\") " pod="openshift-ingress-canary/ingress-canary-q7cnf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.550330 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qgn\" (UniqueName: \"kubernetes.io/projected/c2afea9b-5446-4746-86f9-db70b4916992-kube-api-access-s9qgn\") pod \"packageserver-d55dfcdfc-sfgk6\" (UID: \"c2afea9b-5446-4746-86f9-db70b4916992\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.579026 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdtnn\" (UniqueName: \"kubernetes.io/projected/f41d7856-7012-4ade-bc8c-8354d6537e9d-kube-api-access-sdtnn\") pod \"service-ca-9c57cc56f-44vxs\" (UID: \"f41d7856-7012-4ade-bc8c-8354d6537e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.584116 4669 generic.go:334] "Generic (PLEG): container finished" podID="8f196659-a904-4e87-a32c-cae07c3911ea" containerID="25ecbf260c72e3bf18444c105f1094e215f6b9c821d3cafb5ea1caad37886b63" exitCode=0 Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.584194 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" event={"ID":"8f196659-a904-4e87-a32c-cae07c3911ea","Type":"ContainerDied","Data":"25ecbf260c72e3bf18444c105f1094e215f6b9c821d3cafb5ea1caad37886b63"} Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.591402 4669 generic.go:334] "Generic (PLEG): container finished" podID="4c8e059d-1baa-4142-a6e9-af3c9bfe16d3" containerID="f10e7077568b7b2f7cd70735c5077dca7c4ed59f76498536b041dd8407a2a598" exitCode=0 Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.591519 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" event={"ID":"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3","Type":"ContainerDied","Data":"f10e7077568b7b2f7cd70735c5077dca7c4ed59f76498536b041dd8407a2a598"} Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.597635 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.601166 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjztf\" (UniqueName: \"kubernetes.io/projected/f3827dd5-b842-4000-8b2e-37f7cc411542-kube-api-access-qjztf\") pod \"machine-config-operator-74547568cd-sbx7t\" (UID: \"f3827dd5-b842-4000-8b2e-37f7cc411542\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.602885 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" event={"ID":"e7445657-b8e4-4974-a680-7a05f0628fb7","Type":"ContainerStarted","Data":"e132459c40ad1bc98269ba4d6970690d131c6b254fa0ed8254cc7acd7a5f5838"} Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.602937 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" event={"ID":"e7445657-b8e4-4974-a680-7a05f0628fb7","Type":"ContainerStarted","Data":"002a0b0083efd05a04d8955f54d30307f028794dd86673ded4178eedeccb59a3"} Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.609592 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cclkd" event={"ID":"1467a745-44bf-40c6-a065-5008543d1363","Type":"ContainerStarted","Data":"6c458cc069204979441f44c6bdcb90c38013bfa70b6bc3cc74bc1bd934e1730b"} Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.620665 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.620974 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.120936419 +0000 UTC m=+143.220501396 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.621363 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.621549 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7lq\" (UniqueName: \"kubernetes.io/projected/81a86689-3fbe-4668-9fe9-19113485da2f-kube-api-access-vl7lq\") pod \"openshift-apiserver-operator-796bbdcf4f-f48nt\" (UID: \"81a86689-3fbe-4668-9fe9-19113485da2f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.621828 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.12182039 +0000 UTC m=+143.221385367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.631878 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65cpm\" (UniqueName: \"kubernetes.io/projected/69d480e4-07d0-4081-a7b1-9c8568ee449a-kube-api-access-65cpm\") pod \"migrator-59844c95c7-k7fz4\" (UID: \"69d480e4-07d0-4081-a7b1-9c8568ee449a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.640668 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.644218 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" event={"ID":"75bcc3da-1b36-4ee1-860e-787d82ea77e2","Type":"ContainerStarted","Data":"1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e"} Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.645153 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.646695 4669 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sbrcs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 01 
11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.646727 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" podUID="75bcc3da-1b36-4ee1-860e-787d82ea77e2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.651021 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q7cnf" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.651801 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.653756 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8rtg\" (UniqueName: \"kubernetes.io/projected/dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2-kube-api-access-l8rtg\") pod \"dns-default-5bjch\" (UID: \"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2\") " pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.657404 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.679983 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg8x6\" (UniqueName: \"kubernetes.io/projected/1b219d54-0074-4283-963c-9f53c7b270fd-kube-api-access-mg8x6\") pod \"catalog-operator-68c6474976-czlnv\" (UID: \"1b219d54-0074-4283-963c-9f53c7b270fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.699344 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.710759 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/955a43ae-0efe-491a-8b49-65c5d0251203-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vwnc7\" (UID: \"955a43ae-0efe-491a-8b49-65c5d0251203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.712189 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.717871 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv"] Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.717909 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wn6cw"] Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.719663 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t297l\" (UniqueName: \"kubernetes.io/projected/cfd95c71-623e-4ee4-aadf-752a8e07d362-kube-api-access-t297l\") pod \"router-default-5444994796-gmqg9\" (UID: \"cfd95c71-623e-4ee4-aadf-752a8e07d362\") " pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.722049 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.722329 4669 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.222303315 +0000 UTC m=+143.321868292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.725997 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.726870 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.226854547 +0000 UTC m=+143.326419524 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.727975 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hdrzq"] Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.750070 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22vqk\" (UniqueName: \"kubernetes.io/projected/4ef90082-710c-48db-81f4-535db9195c2f-kube-api-access-22vqk\") pod \"olm-operator-6b444d44fb-6nqdz\" (UID: \"4ef90082-710c-48db-81f4-535db9195c2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.752745 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d6022c9-7f75-48fa-98b8-a15e286c85b0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5d22b\" (UID: \"5d6022c9-7f75-48fa-98b8-a15e286c85b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.775736 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqmqz\" (UniqueName: \"kubernetes.io/projected/83f83da4-e855-4070-b524-4b7b789d0215-kube-api-access-vqmqz\") pod \"control-plane-machine-set-operator-78cbb6b69f-jfjm9\" (UID: \"83f83da4-e855-4070-b524-4b7b789d0215\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9" Oct 01 11:30:51 crc kubenswrapper[4669]: 
I1001 11:30:51.791454 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsd22\" (UniqueName: \"kubernetes.io/projected/6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e-kube-api-access-jsd22\") pod \"multus-admission-controller-857f4d67dd-zxlg8\" (UID: \"6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.794219 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.802717 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.810824 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.814510 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r8qq\" (UniqueName: \"kubernetes.io/projected/78380b45-27e9-43cf-8e16-c8c63e0c217f-kube-api-access-7r8qq\") pod \"machine-config-controller-84d6567774-hhwl4\" (UID: \"78380b45-27e9-43cf-8e16-c8c63e0c217f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.829888 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjqsn\" (UniqueName: \"kubernetes.io/projected/0ed84780-32e8-41fe-a20d-4c7a633ee541-kube-api-access-vjqsn\") pod \"collect-profiles-29321970-cpbkk\" (UID: \"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.830302 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.830397 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.330371607 +0000 UTC m=+143.429936584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.830540 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.831833 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.832214 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.332197513 +0000 UTC m=+143.431762490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.841870 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.849572 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:51 crc kubenswrapper[4669]: W1001 11:30:51.857289 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54979db4_1c85_4bfd_aec1_c154590ec33b.slice/crio-96c636dab5333d851c932b00e81ffd8faa1d7429ca608d6be191f3987a8abad7 WatchSource:0}: Error finding container 96c636dab5333d851c932b00e81ffd8faa1d7429ca608d6be191f3987a8abad7: Status 404 returned error can't find the container with id 96c636dab5333d851c932b00e81ffd8faa1d7429ca608d6be191f3987a8abad7 Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.859430 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k76q\" (UniqueName: \"kubernetes.io/projected/4dabd582-ba32-4518-920b-4cf38903dffc-kube-api-access-2k76q\") pod \"machine-config-server-x52td\" (UID: \"4dabd582-ba32-4518-920b-4cf38903dffc\") " pod="openshift-machine-config-operator/machine-config-server-x52td" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.868942 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.872572 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgnw\" (UniqueName: \"kubernetes.io/projected/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-kube-api-access-lmgnw\") pod \"marketplace-operator-79b997595-dcmjv\" (UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.881440 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.890108 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/167650ce-b43a-4e35-93c1-a802838246dd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vhvv\" (UID: \"167650ce-b43a-4e35-93c1-a802838246dd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.905910 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.913432 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.916475 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkxkz\" (UniqueName: \"kubernetes.io/projected/4f48de82-89d5-4ef8-b5fe-71ef81240421-kube-api-access-lkxkz\") pod \"package-server-manager-789f6589d5-h4bpt\" (UID: \"4f48de82-89d5-4ef8-b5fe-71ef81240421\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.922867 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.928987 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2j4\" (UniqueName: \"kubernetes.io/projected/fa1b934f-dde2-493a-be9e-962e002e3075-kube-api-access-hv2j4\") pod \"service-ca-operator-777779d784-dqh7b\" (UID: \"fa1b934f-dde2-493a-be9e-962e002e3075\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.931129 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.933247 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:51 crc kubenswrapper[4669]: E1001 11:30:51.933672 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.433656242 +0000 UTC m=+143.533221219 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.940827 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.949780 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt"] Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.949813 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gddk\" (UniqueName: \"kubernetes.io/projected/4fcb235d-657d-4be7-bbb3-afee58c08df9-kube-api-access-5gddk\") pod \"kube-storage-version-migrator-operator-b67b599dd-v5z8k\" (UID: \"4fcb235d-657d-4be7-bbb3-afee58c08df9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.969326 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" Oct 01 11:30:51 crc kubenswrapper[4669]: I1001 11:30:51.977045 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.010493 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x52td" Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.026109 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.041896 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.042540 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.542526374 +0000 UTC m=+143.642091351 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.085374 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.118689 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6vbrp"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.118799 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.122289 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.133620 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8hc7m"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.143189 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.143394 4669 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.643373718 +0000 UTC m=+143.742938695 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.143460 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.143740 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.643733077 +0000 UTC m=+143.743298054 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.155328 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.181002 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.194490 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x95jd"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.205309 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.245073 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.245348 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 11:30:52.74530548 +0000 UTC m=+143.844870457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.245560 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.246252 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.746220261 +0000 UTC m=+143.845785278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: W1001 11:30:52.283056 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00e38d1c_41cd_437a_a6e9_3a53fe903c11.slice/crio-3c6feb054d3439058133f8960a3bfed2252db681c0a7a7dccd8239bd1d688662 WatchSource:0}: Error finding container 3c6feb054d3439058133f8960a3bfed2252db681c0a7a7dccd8239bd1d688662: Status 404 returned error can't find the container with id 3c6feb054d3439058133f8960a3bfed2252db681c0a7a7dccd8239bd1d688662 Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.346890 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.347334 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.847300341 +0000 UTC m=+143.946865338 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.347554 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.347951 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.847938007 +0000 UTC m=+143.947502984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.450354 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.450522 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.950494234 +0000 UTC m=+144.050059231 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.450784 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.451122 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:52.951112549 +0000 UTC m=+144.050677526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.551823 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.552419 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.052375724 +0000 UTC m=+144.151940721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.579665 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q8qw4"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.651530 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hdrzq" event={"ID":"54979db4-1c85-4bfd-aec1-c154590ec33b","Type":"ContainerStarted","Data":"96c636dab5333d851c932b00e81ffd8faa1d7429ca608d6be191f3987a8abad7"} Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.652105 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5bjch"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.653032 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.653060 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" event={"ID":"4f9513dc-5114-4ec2-81ec-d86a31c3635b","Type":"ContainerStarted","Data":"05bb20f95bfd4801c602f82dd3ab04edecac9d9b52e4145a07cb22139d688568"} Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.653469 4669 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.153449173 +0000 UTC m=+144.253014150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.653878 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" event={"ID":"8e2e100b-8917-4730-83b1-2fc7716f740b","Type":"ContainerStarted","Data":"3b3c6bba9ff0be3e54fec59f37ad6e546a338d3d00394bafb6557a388c8c8195"} Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.657427 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" event={"ID":"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e","Type":"ContainerStarted","Data":"ad520c74ea111e60b9af7480fec4a63baac8559b51ca99cf7d150b9a3eca0626"} Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.661381 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" event={"ID":"b5100377-ee4b-4427-9106-eea735423f5a","Type":"ContainerStarted","Data":"5e75efc9000483c2b7ef2f9e622ceff962ccfeb2cafcc57a4b324bb7fc023f09"} Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.664213 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" 
event={"ID":"c2afea9b-5446-4746-86f9-db70b4916992","Type":"ContainerStarted","Data":"bc0fe73dee4c132b471d8872d057483ec9f9b5b4e4b90a416f9b8c6a875d83dc"} Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.666487 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" event={"ID":"796ccf8d-b179-440a-87a8-c6de61d08d4a","Type":"ContainerStarted","Data":"0a0f9128c67c3378cfc42fb51937b418c815b35a84e496cf2a457a626af4644d"} Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.669659 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" event={"ID":"00e38d1c-41cd-437a-a6e9-3a53fe903c11","Type":"ContainerStarted","Data":"3c6feb054d3439058133f8960a3bfed2252db681c0a7a7dccd8239bd1d688662"} Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.670695 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" event={"ID":"06fa5e25-562e-4bde-96d5-c0877aa235f7","Type":"ContainerStarted","Data":"3ecf14199afab8b66422a61ed49a955f842090191c512168da946f7db2503f74"} Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.671402 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" event={"ID":"1bf82d78-0b71-43b4-b6d3-babe39dd328e","Type":"ContainerStarted","Data":"c573ea7b6f771929b206be9b694fa516aeb6d844c48a5c34c89d5c7c862b9041"} Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.672693 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" event={"ID":"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c","Type":"ContainerStarted","Data":"7ea65b996822eaef746af64a36d8d134ae8f6a3f48389b0e915df52b9820f2ee"} Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.673673 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" event={"ID":"d3392045-54bc-4b2a-a1f8-b7ac9f0d145b","Type":"ContainerStarted","Data":"52af104ce9153e2b92836bb7aae318f745b9b61b7014b22cc6fc87ee8c4329b1"} Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.674782 4669 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sbrcs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.674813 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" podUID="75bcc3da-1b36-4ee1-860e-787d82ea77e2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.703412 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q7cnf"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.718804 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qqhrp"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.737443 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.754718 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 
11:30:52.754806 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.2547844 +0000 UTC m=+144.354349377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.755063 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.755426 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.255404455 +0000 UTC m=+144.354969522 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.787365 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.856532 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.856754 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.35670418 +0000 UTC m=+144.456269157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.875011 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-44vxs"] Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.957819 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:52 crc kubenswrapper[4669]: E1001 11:30:52.958382 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.458356175 +0000 UTC m=+144.557921162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:52 crc kubenswrapper[4669]: I1001 11:30:52.980047 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" podStartSLOduration=122.980014728 podStartE2EDuration="2m2.980014728s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:52.96953644 +0000 UTC m=+144.069101427" watchObservedRunningTime="2025-10-01 11:30:52.980014728 +0000 UTC m=+144.079579745" Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.058588 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.058825 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.558794629 +0000 UTC m=+144.658359606 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.058934 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.059588 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.559581348 +0000 UTC m=+144.659146325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.160184 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.160482 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.660444513 +0000 UTC m=+144.760009490 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.160603 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.161517 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.661488719 +0000 UTC m=+144.761053736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.261764 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.262326 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.762283242 +0000 UTC m=+144.861848259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.262444 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.262901 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.762885386 +0000 UTC m=+144.862450393 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.363508 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.363765 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.86372153 +0000 UTC m=+144.963286547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.465359 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.465929 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:53.965905727 +0000 UTC m=+145.065470704 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.565956 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.566172 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.066133646 +0000 UTC m=+145.165698673 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.566641 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.567205 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.067186862 +0000 UTC m=+145.166751869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: W1001 11:30:53.636477 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d0ee8e1_4e70_40fe_8780_567c7b49825b.slice/crio-2b5fb6fb3d25081cb6681d019021bddfe1df5bf1dfb19f7633732335522dc1f4 WatchSource:0}: Error finding container 2b5fb6fb3d25081cb6681d019021bddfe1df5bf1dfb19f7633732335522dc1f4: Status 404 returned error can't find the container with id 2b5fb6fb3d25081cb6681d019021bddfe1df5bf1dfb19f7633732335522dc1f4 Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.667475 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.667734 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.167688379 +0000 UTC m=+145.267253386 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: W1001 11:30:53.673235 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc5fcd20_c6a6_4960_b44b_571ec0f6d8f2.slice/crio-66dff59ab60603089ee875f33cbbafd20bb648126a1f94da2aba50b5412879d6 WatchSource:0}: Error finding container 66dff59ab60603089ee875f33cbbafd20bb648126a1f94da2aba50b5412879d6: Status 404 returned error can't find the container with id 66dff59ab60603089ee875f33cbbafd20bb648126a1f94da2aba50b5412879d6 Oct 01 11:30:53 crc kubenswrapper[4669]: W1001 11:30:53.679660 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc9c2ee1_684f_462b_be84_3cae1de6a0da.slice/crio-100ded36849ca8bacd602dbd7d9faaef11fe673dcd4711902603c63dfadaaadc WatchSource:0}: Error finding container 100ded36849ca8bacd602dbd7d9faaef11fe673dcd4711902603c63dfadaaadc: Status 404 returned error can't find the container with id 100ded36849ca8bacd602dbd7d9faaef11fe673dcd4711902603c63dfadaaadc Oct 01 11:30:53 crc kubenswrapper[4669]: W1001 11:30:53.685651 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ef90082_710c_48db_81f4_535db9195c2f.slice/crio-73aa1d1de56af1f321d7f46e4fac2ef8074a48f6fcfda61cce114484d3f7af6f WatchSource:0}: Error finding container 73aa1d1de56af1f321d7f46e4fac2ef8074a48f6fcfda61cce114484d3f7af6f: Status 404 returned error can't find the container with 
id 73aa1d1de56af1f321d7f46e4fac2ef8074a48f6fcfda61cce114484d3f7af6f Oct 01 11:30:53 crc kubenswrapper[4669]: W1001 11:30:53.686292 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd223984b_062e_423a_bfb9_f28dc4dd215b.slice/crio-486eae5cfd01c8a0514b21aa92c32b7ea08442c16a53f1f2459d862b43d844db WatchSource:0}: Error finding container 486eae5cfd01c8a0514b21aa92c32b7ea08442c16a53f1f2459d862b43d844db: Status 404 returned error can't find the container with id 486eae5cfd01c8a0514b21aa92c32b7ea08442c16a53f1f2459d862b43d844db Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.690654 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" event={"ID":"8f196659-a904-4e87-a32c-cae07c3911ea","Type":"ContainerStarted","Data":"787515af2cf14b800b3cdcb7e84f9508300b8b47388575923db80f57adeddb1c"} Oct 01 11:30:53 crc kubenswrapper[4669]: W1001 11:30:53.699799 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf41d7856_7012_4ade_bc8c_8354d6537e9d.slice/crio-ca557e784620aa61f9d14c9eb7cb1f7ef86c200a3d199d513064e8223bbac3e3 WatchSource:0}: Error finding container ca557e784620aa61f9d14c9eb7cb1f7ef86c200a3d199d513064e8223bbac3e3: Status 404 returned error can't find the container with id ca557e784620aa61f9d14c9eb7cb1f7ef86c200a3d199d513064e8223bbac3e3 Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.708851 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" event={"ID":"0debf10a-a4dd-43d7-84fd-3456a2ad1b59","Type":"ContainerStarted","Data":"dcd57c62c22e7a976028b42821bf48e0c6a3d25f9d288aef31f29491304ad50a"} Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.710225 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" 
event={"ID":"1d0ee8e1-4e70-40fe-8780-567c7b49825b","Type":"ContainerStarted","Data":"2b5fb6fb3d25081cb6681d019021bddfe1df5bf1dfb19f7633732335522dc1f4"} Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.712998 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cclkd" event={"ID":"1467a745-44bf-40c6-a065-5008543d1363","Type":"ContainerStarted","Data":"fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6"} Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.725041 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5bjch" event={"ID":"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2","Type":"ContainerStarted","Data":"66dff59ab60603089ee875f33cbbafd20bb648126a1f94da2aba50b5412879d6"} Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.726194 4669 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sbrcs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.726252 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" podUID="75bcc3da-1b36-4ee1-860e-787d82ea77e2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.770555 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 
11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.775582 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.275447312 +0000 UTC m=+145.375012289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.861247 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7q7n5" podStartSLOduration=123.861213155 podStartE2EDuration="2m3.861213155s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:53.851430494 +0000 UTC m=+144.950995471" watchObservedRunningTime="2025-10-01 11:30:53.861213155 +0000 UTC m=+144.960778132" Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.877672 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:53 crc kubenswrapper[4669]: E1001 11:30:53.878259 4669 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.378230784 +0000 UTC m=+145.477795761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.895028 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-cclkd" podStartSLOduration=123.895008088 podStartE2EDuration="2m3.895008088s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:53.889342548 +0000 UTC m=+144.988907535" watchObservedRunningTime="2025-10-01 11:30:53.895008088 +0000 UTC m=+144.994573065" Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.938840 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv"] Oct 01 11:30:53 crc kubenswrapper[4669]: I1001 11:30:53.980049 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:53 crc kubenswrapper[4669]: 
E1001 11:30:53.980624 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.480603466 +0000 UTC m=+145.580168443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.082091 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.082388 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.582358613 +0000 UTC m=+145.681923590 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.082647 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.082951 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.582944107 +0000 UTC m=+145.682509084 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.140905 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zxlg8"] Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.183668 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.184106 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.684086588 +0000 UTC m=+145.783651565 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.204879 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt"] Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.284426 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.284837 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.78482477 +0000 UTC m=+145.884389747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: W1001 11:30:54.309585 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c7c8f4d_66b6_4e26_b7cb_fbf332fc1b1e.slice/crio-5bc7a7b88018419253bb133664216d88385c0bf94853693ec69a170147bcffff WatchSource:0}: Error finding container 5bc7a7b88018419253bb133664216d88385c0bf94853693ec69a170147bcffff: Status 404 returned error can't find the container with id 5bc7a7b88018419253bb133664216d88385c0bf94853693ec69a170147bcffff Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.382646 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt"] Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.385544 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.385842 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.885819988 +0000 UTC m=+145.985384965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.388349 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4"] Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.409601 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9"] Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.412344 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk"] Oct 01 11:30:54 crc kubenswrapper[4669]: W1001 11:30:54.461119 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81a86689_3fbe_4668_9fe9_19113485da2f.slice/crio-6684ef40f58af26e7ed16c5542d2b33e8d09c85b97352d7712ddcdd4ce0a59a8 WatchSource:0}: Error finding container 6684ef40f58af26e7ed16c5542d2b33e8d09c85b97352d7712ddcdd4ce0a59a8: Status 404 returned error can't find the container with id 6684ef40f58af26e7ed16c5542d2b33e8d09c85b97352d7712ddcdd4ce0a59a8 Oct 01 11:30:54 crc kubenswrapper[4669]: W1001 11:30:54.462285 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69d480e4_07d0_4081_a7b1_9c8568ee449a.slice/crio-004e6002676d2df1c18df621a91e7c43c65c168fd57d15a69ed8f9623a06c347 WatchSource:0}: Error finding container 004e6002676d2df1c18df621a91e7c43c65c168fd57d15a69ed8f9623a06c347: Status 404 
returned error can't find the container with id 004e6002676d2df1c18df621a91e7c43c65c168fd57d15a69ed8f9623a06c347 Oct 01 11:30:54 crc kubenswrapper[4669]: W1001 11:30:54.479539 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed84780_32e8_41fe_a20d_4c7a633ee541.slice/crio-21efbfbefc91124dc387c7e3678a72db891feeb8516a2393a85ed4cfc6177508 WatchSource:0}: Error finding container 21efbfbefc91124dc387c7e3678a72db891feeb8516a2393a85ed4cfc6177508: Status 404 returned error can't find the container with id 21efbfbefc91124dc387c7e3678a72db891feeb8516a2393a85ed4cfc6177508 Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.486587 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.487045 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:54.987029802 +0000 UTC m=+146.086594779 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.522800 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b"] Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.572531 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t"] Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.574810 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4"] Oct 01 11:30:54 crc kubenswrapper[4669]: W1001 11:30:54.585397 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa1b934f_dde2_493a_be9e_962e002e3075.slice/crio-60da79597af5a33dcacabc09df8f2ee9300105176f363f34a0a27620140c3667 WatchSource:0}: Error finding container 60da79597af5a33dcacabc09df8f2ee9300105176f363f34a0a27620140c3667: Status 404 returned error can't find the container with id 60da79597af5a33dcacabc09df8f2ee9300105176f363f34a0a27620140c3667 Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.587844 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 
11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.588289 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.088255745 +0000 UTC m=+146.187820722 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.588460 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.588889 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.08887418 +0000 UTC m=+146.188439157 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: W1001 11:30:54.591887 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3827dd5_b842_4000_8b2e_37f7cc411542.slice/crio-710a9ea32b9040e8382a8d43e71c316105bd2fcee465fe2adaa7e51d2a210b58 WatchSource:0}: Error finding container 710a9ea32b9040e8382a8d43e71c316105bd2fcee465fe2adaa7e51d2a210b58: Status 404 returned error can't find the container with id 710a9ea32b9040e8382a8d43e71c316105bd2fcee465fe2adaa7e51d2a210b58 Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.689091 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.689281 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.189245443 +0000 UTC m=+146.288810420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.689630 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.690058 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.190047603 +0000 UTC m=+146.289612810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.737371 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" event={"ID":"0ed84780-32e8-41fe-a20d-4c7a633ee541","Type":"ContainerStarted","Data":"21efbfbefc91124dc387c7e3678a72db891feeb8516a2393a85ed4cfc6177508"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.744508 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" event={"ID":"f41d7856-7012-4ade-bc8c-8354d6537e9d","Type":"ContainerStarted","Data":"ca557e784620aa61f9d14c9eb7cb1f7ef86c200a3d199d513064e8223bbac3e3"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.784141 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7"] Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.788828 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" event={"ID":"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3","Type":"ContainerStarted","Data":"fd4e67b22c518bd3b3946f83538b4d05c3254ceffc008d46d5e4b442602756be"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.793410 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.793837 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.293779558 +0000 UTC m=+146.393344525 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.794502 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.795005 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.294974828 +0000 UTC m=+146.394539805 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.796953 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" event={"ID":"81a86689-3fbe-4668-9fe9-19113485da2f","Type":"ContainerStarted","Data":"6684ef40f58af26e7ed16c5542d2b33e8d09c85b97352d7712ddcdd4ce0a59a8"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.803877 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" event={"ID":"4ef90082-710c-48db-81f4-535db9195c2f","Type":"ContainerStarted","Data":"73aa1d1de56af1f321d7f46e4fac2ef8074a48f6fcfda61cce114484d3f7af6f"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.807440 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dcmjv"] Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.807489 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" event={"ID":"78380b45-27e9-43cf-8e16-c8c63e0c217f","Type":"ContainerStarted","Data":"706bdd90f4d620c97cc87d751cc43f8bcbb2b6849060c931470db457fc728eff"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.816521 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" 
event={"ID":"f3827dd5-b842-4000-8b2e-37f7cc411542","Type":"ContainerStarted","Data":"710a9ea32b9040e8382a8d43e71c316105bd2fcee465fe2adaa7e51d2a210b58"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.820060 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b"] Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.824447 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv"] Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.839349 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" event={"ID":"4f48de82-89d5-4ef8-b5fe-71ef81240421","Type":"ContainerStarted","Data":"0955411afb77a6ceb3a1f8c218deb8274080bc8931d01000b24c070ecfa933f3"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.848391 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qqhrp" event={"ID":"d223984b-062e-423a-bfb9-f28dc4dd215b","Type":"ContainerStarted","Data":"486eae5cfd01c8a0514b21aa92c32b7ea08442c16a53f1f2459d862b43d844db"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.877470 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" event={"ID":"6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e","Type":"ContainerStarted","Data":"5bc7a7b88018419253bb133664216d88385c0bf94853693ec69a170147bcffff"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.881905 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q7cnf" event={"ID":"bc9c2ee1-684f-462b-be84-3cae1de6a0da","Type":"ContainerStarted","Data":"100ded36849ca8bacd602dbd7d9faaef11fe673dcd4711902603c63dfadaaadc"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.887900 4669 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" event={"ID":"fa1b934f-dde2-493a-be9e-962e002e3075","Type":"ContainerStarted","Data":"60da79597af5a33dcacabc09df8f2ee9300105176f363f34a0a27620140c3667"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.889585 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" event={"ID":"4fcb235d-657d-4be7-bbb3-afee58c08df9","Type":"ContainerStarted","Data":"3fd06ff23ff83b5510a0678eea8722e38f9a5be23d8b76589c3654b173f7940d"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.892130 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gmqg9" event={"ID":"cfd95c71-623e-4ee4-aadf-752a8e07d362","Type":"ContainerStarted","Data":"f5592209ef2234dc7792fb7b7253423e6bd60fbaa64665c6a85051b9f1e363d7"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.895127 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:54 crc kubenswrapper[4669]: E1001 11:30:54.895427 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.395410481 +0000 UTC m=+146.494975458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.897282 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4" event={"ID":"69d480e4-07d0-4081-a7b1-9c8568ee449a","Type":"ContainerStarted","Data":"004e6002676d2df1c18df621a91e7c43c65c168fd57d15a69ed8f9623a06c347"} Oct 01 11:30:54 crc kubenswrapper[4669]: W1001 11:30:54.897377 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d6022c9_7f75_48fa_98b8_a15e286c85b0.slice/crio-6af0e9afc67b255723255dd15b8b2b8d0a96ecc1743c8fd1f1af198ce561c720 WatchSource:0}: Error finding container 6af0e9afc67b255723255dd15b8b2b8d0a96ecc1743c8fd1f1af198ce561c720: Status 404 returned error can't find the container with id 6af0e9afc67b255723255dd15b8b2b8d0a96ecc1743c8fd1f1af198ce561c720 Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.899951 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" event={"ID":"167650ce-b43a-4e35-93c1-a802838246dd","Type":"ContainerStarted","Data":"0a9b46514338f71b093437d07024c6d34f23f7d7f873b9be512e984d73b50892"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.901434 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x52td" 
event={"ID":"4dabd582-ba32-4518-920b-4cf38903dffc","Type":"ContainerStarted","Data":"9152ca3c6f982ea84a2adce03ac3c9a0cdb642793df083f14c411f7ee66727ba"} Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.905866 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9" event={"ID":"83f83da4-e855-4070-b524-4b7b789d0215","Type":"ContainerStarted","Data":"4967db9275a1f58848f5238c253ce70d19e0ab8b81a097dccce951fda3f7042d"} Oct 01 11:30:54 crc kubenswrapper[4669]: W1001 11:30:54.923307 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35ac61d5_a664_4f55_9a9d_c80e3dc18b16.slice/crio-5c257f282690fb7944a914847efb9c012117bcf7ce8895a7162c7d012d70243c WatchSource:0}: Error finding container 5c257f282690fb7944a914847efb9c012117bcf7ce8895a7162c7d012d70243c: Status 404 returned error can't find the container with id 5c257f282690fb7944a914847efb9c012117bcf7ce8895a7162c7d012d70243c Oct 01 11:30:54 crc kubenswrapper[4669]: I1001 11:30:54.928394 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" podStartSLOduration=124.928370844 podStartE2EDuration="2m4.928370844s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:54.927728317 +0000 UTC m=+146.027293304" watchObservedRunningTime="2025-10-01 11:30:54.928370844 +0000 UTC m=+146.027935831" Oct 01 11:30:54 crc kubenswrapper[4669]: W1001 11:30:54.933381 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b219d54_0074_4283_963c_9f53c7b270fd.slice/crio-c8d2e5bb82ce4a5608cdd4890e21ad016e230bb26849f796d2a3fd3ffe1b2963 WatchSource:0}: Error finding container 
c8d2e5bb82ce4a5608cdd4890e21ad016e230bb26849f796d2a3fd3ffe1b2963: Status 404 returned error can't find the container with id c8d2e5bb82ce4a5608cdd4890e21ad016e230bb26849f796d2a3fd3ffe1b2963 Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.000020 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.003406 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.503381872 +0000 UTC m=+146.602946859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.102808 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.102998 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.602935984 +0000 UTC m=+146.702500951 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.108792 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.109223 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.609195158 +0000 UTC m=+146.708760135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.209540 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.209824 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.709791916 +0000 UTC m=+146.809356893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.210289 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.210788 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.710769961 +0000 UTC m=+146.810334938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.312243 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.312831 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.812805213 +0000 UTC m=+146.912370190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.413853 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.414533 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:55.914495639 +0000 UTC m=+147.014060616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.515402 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.515789 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.015768853 +0000 UTC m=+147.115333830 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.617358 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.617849 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.117826757 +0000 UTC m=+147.217391734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.718070 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.718252 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.21821757 +0000 UTC m=+147.317782547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.718401 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.718765 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.218758783 +0000 UTC m=+147.318323760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.820874 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.821304 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.32128243 +0000 UTC m=+147.420847407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.928250 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:55 crc kubenswrapper[4669]: E1001 11:30:55.929297 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.429232289 +0000 UTC m=+147.528797276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.969068 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" event={"ID":"fa1b934f-dde2-493a-be9e-962e002e3075","Type":"ContainerStarted","Data":"ceb5830983ba3a5d7c2bf138a338ed7af3a046892dfeb673a96e8e9aabc11a69"}
Oct 01 11:30:55 crc kubenswrapper[4669]: I1001 11:30:55.993353 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" event={"ID":"f41d7856-7012-4ade-bc8c-8354d6537e9d","Type":"ContainerStarted","Data":"0fe5ae9a398582ab8be228d63100c962a85d2451997b757b2c37443674ccc1bd"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.029393 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.033149 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" event={"ID":"955a43ae-0efe-491a-8b49-65c5d0251203","Type":"ContainerStarted","Data":"dfa569bb79172bfd4cf9098afe38db506d853f65373bca315b007c1fbdf6f152"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.033196 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" event={"ID":"955a43ae-0efe-491a-8b49-65c5d0251203","Type":"ContainerStarted","Data":"2fb9cf3501a7c1e836f9e2bec616395c973aac577fcb85b8bd82bb1147aac658"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.037896 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" event={"ID":"81a86689-3fbe-4668-9fe9-19113485da2f","Type":"ContainerStarted","Data":"6a39281b6765d6a0ac4cb81c8e7e3f58feb549991f439f835d6f7424efa84987"}
Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.038284 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.538244164 +0000 UTC m=+147.637809141 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.081504 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwnc7" podStartSLOduration=126.081476388 podStartE2EDuration="2m6.081476388s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.07907028 +0000 UTC m=+147.178635277" watchObservedRunningTime="2025-10-01 11:30:56.081476388 +0000 UTC m=+147.181041365"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.082661 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-44vxs" podStartSLOduration=125.082653098 podStartE2EDuration="2m5.082653098s" podCreationTimestamp="2025-10-01 11:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.032153564 +0000 UTC m=+147.131718531" watchObservedRunningTime="2025-10-01 11:30:56.082653098 +0000 UTC m=+147.182218075"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.093660 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9" event={"ID":"83f83da4-e855-4070-b524-4b7b789d0215","Type":"ContainerStarted","Data":"cd8d5019901bd2f600ec1a168ae15f0a317be6e9ca8cb3ece418368020a5282a"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.121059 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" event={"ID":"0ed84780-32e8-41fe-a20d-4c7a633ee541","Type":"ContainerStarted","Data":"b2ef4c3a87df0e2cf56a0fd6b2e309deb960a8c375c556503bb0dbc4459a544d"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.131580 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f48nt" podStartSLOduration=126.123069103 podStartE2EDuration="2m6.123069103s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.119709791 +0000 UTC m=+147.219274778" watchObservedRunningTime="2025-10-01 11:30:56.123069103 +0000 UTC m=+147.222634080"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.139897 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt"
Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.140288 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.640269837 +0000 UTC m=+147.739834814 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.199321 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jfjm9" podStartSLOduration=126.19920805 podStartE2EDuration="2m6.19920805s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.19478582 +0000 UTC m=+147.294350827" watchObservedRunningTime="2025-10-01 11:30:56.19920805 +0000 UTC m=+147.298773027"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.219648 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" event={"ID":"4fcb235d-657d-4be7-bbb3-afee58c08df9","Type":"ContainerStarted","Data":"fdfcd63434e3ed9eaf4ad889dfa5b58077ec93ba9732470e38e6bc104bde8faa"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.245882 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.251819 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.751792274 +0000 UTC m=+147.851357251 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.279622 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" podStartSLOduration=56.279595389 podStartE2EDuration="56.279595389s" podCreationTimestamp="2025-10-01 11:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.237203245 +0000 UTC m=+147.336768232" watchObservedRunningTime="2025-10-01 11:30:56.279595389 +0000 UTC m=+147.379160366"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.283466 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4" event={"ID":"69d480e4-07d0-4081-a7b1-9c8568ee449a","Type":"ContainerStarted","Data":"c14d5a036884171e77a75076f7d6d18bd997a6d6d2728aacb0564484b233b490"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.325219 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qqhrp" event={"ID":"d223984b-062e-423a-bfb9-f28dc4dd215b","Type":"ContainerStarted","Data":"b54e48f122e8834996318c455a5ae46d969060fa85b4b0f65e1ccfc4669f49a5"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.326594 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qqhrp"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.350945 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt"
Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.353057 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.853036788 +0000 UTC m=+147.952601765 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.355225 4669 patch_prober.go:28] interesting pod/console-operator-58897d9998-qqhrp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body=
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.355295 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qqhrp" podUID="d223984b-062e-423a-bfb9-f28dc4dd215b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.358452 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" event={"ID":"00e38d1c-41cd-437a-a6e9-3a53fe903c11","Type":"ContainerStarted","Data":"24ea4bc5be2cede4fc2c2f6375f37c5fc7b117984a1919bb102d1a7b7c80ffad"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.375565 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v5z8k" podStartSLOduration=126.375539564 podStartE2EDuration="2m6.375539564s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.284529821 +0000 UTC m=+147.384094798" watchObservedRunningTime="2025-10-01 11:30:56.375539564 +0000 UTC m=+147.475104541"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.385231 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qqhrp" podStartSLOduration=126.385205041 podStartE2EDuration="2m6.385205041s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.3823355 +0000 UTC m=+147.481900477" watchObservedRunningTime="2025-10-01 11:30:56.385205041 +0000 UTC m=+147.484770018"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.403446 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hdrzq" event={"ID":"54979db4-1c85-4bfd-aec1-c154590ec33b","Type":"ContainerStarted","Data":"51a63a4d29c240869363fa77a1163ef0aab8a703d656590d7173d36995488338"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.404912 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hdrzq"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.413220 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q7cnf" event={"ID":"bc9c2ee1-684f-462b-be84-3cae1de6a0da","Type":"ContainerStarted","Data":"fcade39d8ea131c748dc5f7ed0a1fdaf902128079634b497273a521ed54cebce"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.432282 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body=
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.432369 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.461480 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.464408 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.964379342 +0000 UTC m=+148.063944319 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.465856 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.466339 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" event={"ID":"ef2d43d6-2138-4c6f-9bd1-09a621ebda8c","Type":"ContainerStarted","Data":"bb3cebdb67fa3293de59ca36f79826f79549b53c05836a0b0b4e1547e0b23f10"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.466538 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" podStartSLOduration=126.466509434 podStartE2EDuration="2m6.466509434s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.46390654 +0000 UTC m=+147.563471517" watchObservedRunningTime="2025-10-01 11:30:56.466509434 +0000 UTC m=+147.566074411"
Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.470473 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:56.970453781 +0000 UTC m=+148.070018758 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.514020 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" event={"ID":"6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e","Type":"ContainerStarted","Data":"96dedf69cab3e5c63665679163196c5e88237b7badf21157df880917b080dc9a"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.514224 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-hdrzq" podStartSLOduration=126.5142137 podStartE2EDuration="2m6.5142137s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.508934269 +0000 UTC m=+147.608499256" watchObservedRunningTime="2025-10-01 11:30:56.5142137 +0000 UTC m=+147.613778677"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.573216 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.574051 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.074023412 +0000 UTC m=+148.173588379 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.574179 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.575025 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" event={"ID":"d3392045-54bc-4b2a-a1f8-b7ac9f0d145b","Type":"ContainerStarted","Data":"cf0b6003eb7801a42f29bce322897f982965de5d376f32b0727feeb3ebef718d"}
Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.575392 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.075381686 +0000 UTC m=+148.174946663 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.596946 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k852r" podStartSLOduration=126.596928487 podStartE2EDuration="2m6.596928487s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.579860356 +0000 UTC m=+147.679425333" watchObservedRunningTime="2025-10-01 11:30:56.596928487 +0000 UTC m=+147.696493464"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.597255 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-q7cnf" podStartSLOduration=8.597249525 podStartE2EDuration="8.597249525s" podCreationTimestamp="2025-10-01 11:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.556681426 +0000 UTC m=+147.656246403" watchObservedRunningTime="2025-10-01 11:30:56.597249525 +0000 UTC m=+147.696814502"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.597633 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" event={"ID":"06fa5e25-562e-4bde-96d5-c0877aa235f7","Type":"ContainerStarted","Data":"92486e26f1e7c507c9685535cf8c397081c5ac616b9b70da1d28776253715235"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.604718 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x52td" event={"ID":"4dabd582-ba32-4518-920b-4cf38903dffc","Type":"ContainerStarted","Data":"5dfcfdf8c7c2a78903ec16d5282aaeecf02b9782b2deca918f1f94860c002a22"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.616571 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zp57s" podStartSLOduration=126.616552861 podStartE2EDuration="2m6.616552861s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.615056084 +0000 UTC m=+147.714621061" watchObservedRunningTime="2025-10-01 11:30:56.616552861 +0000 UTC m=+147.716117838"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.634378 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" event={"ID":"4c8e059d-1baa-4142-a6e9-af3c9bfe16d3","Type":"ContainerStarted","Data":"c9423ec50bbe1a68a44bc7bddda5ae99e3a8b4e9ae66191c405a5a32bcbad5e8"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.652820 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-x52td" podStartSLOduration=8.652790303 podStartE2EDuration="8.652790303s" podCreationTimestamp="2025-10-01 11:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.64493963 +0000 UTC m=+147.744504607" watchObservedRunningTime="2025-10-01 11:30:56.652790303 +0000 UTC m=+147.752355280"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.667010 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" event={"ID":"35ac61d5-a664-4f55-9a9d-c80e3dc18b16","Type":"ContainerStarted","Data":"5c257f282690fb7944a914847efb9c012117bcf7ce8895a7162c7d012d70243c"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.667351 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.676206 4669 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dcmjv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.676263 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" podUID="35ac61d5-a664-4f55-9a9d-c80e3dc18b16" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.676978 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.679821 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.179794048 +0000 UTC m=+148.279359025 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.682781 4669 generic.go:334] "Generic (PLEG): container finished" podID="8e2e100b-8917-4730-83b1-2fc7716f740b" containerID="7856dc335613d36890f758ae06046121c30ee57a7c6fef1c9a2d2b5cc24f8ea9" exitCode=0
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.682854 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" event={"ID":"8e2e100b-8917-4730-83b1-2fc7716f740b","Type":"ContainerDied","Data":"7856dc335613d36890f758ae06046121c30ee57a7c6fef1c9a2d2b5cc24f8ea9"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.696444 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" podStartSLOduration=126.696429258 podStartE2EDuration="2m6.696429258s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.690331277 +0000 UTC m=+147.789896254" watchObservedRunningTime="2025-10-01 11:30:56.696429258 +0000 UTC m=+147.795994235"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.704825 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" event={"ID":"1bf82d78-0b71-43b4-b6d3-babe39dd328e","Type":"ContainerStarted","Data":"a1ff0d2effb77dad9c99bef017b3b3a2335134c2a8fbef6c2c33ed6ec0f245da"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.708660 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gmqg9" event={"ID":"cfd95c71-623e-4ee4-aadf-752a8e07d362","Type":"ContainerStarted","Data":"f99ccae4f64129c19ccf14dd32491e79025d420feab03399b9804710a0386043"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.728099 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" event={"ID":"78380b45-27e9-43cf-8e16-c8c63e0c217f","Type":"ContainerStarted","Data":"d6e247d0e6bf69ebb632f73ed732cf47047d8a6e08f017286969cba171cb23c3"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.733456 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" event={"ID":"167650ce-b43a-4e35-93c1-a802838246dd","Type":"ContainerStarted","Data":"0ea6069256f1a977c28472ad8f2d98876303a5bf393e1a8cbe5d2aca2a68d1af"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.764041 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" event={"ID":"4f48de82-89d5-4ef8-b5fe-71ef81240421","Type":"ContainerStarted","Data":"318ed88823ffe3e31b408ba4de8980ac1c2496fa56e07065e2fd231d773ef909"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.764756 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.778274 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt"
Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.781155 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.281116984 +0000 UTC m=+148.380681961 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.783017 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" event={"ID":"1b219d54-0074-4283-963c-9f53c7b270fd","Type":"ContainerStarted","Data":"17ce8c634a47053e92fd0381e4f8d326d8d2259868e05417c0b1d63352057ed4"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.783052 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" event={"ID":"1b219d54-0074-4283-963c-9f53c7b270fd","Type":"ContainerStarted","Data":"c8d2e5bb82ce4a5608cdd4890e21ad016e230bb26849f796d2a3fd3ffe1b2963"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.783935 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.788206 4669 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-czlnv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body=
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.788266 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" podUID="1b219d54-0074-4283-963c-9f53c7b270fd" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.790741 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" event={"ID":"5d6022c9-7f75-48fa-98b8-a15e286c85b0","Type":"ContainerStarted","Data":"a4e20ee48def0e3db23339b4f6556d5edb4c019169dd8584aaf6676076a49d7d"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.790777 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" event={"ID":"5d6022c9-7f75-48fa-98b8-a15e286c85b0","Type":"ContainerStarted","Data":"6af0e9afc67b255723255dd15b8b2b8d0a96ecc1743c8fd1f1af198ce561c720"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.791605 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" podStartSLOduration=126.791592983 podStartE2EDuration="2m6.791592983s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.788775763 +0000 UTC m=+147.888340740" watchObservedRunningTime="2025-10-01 11:30:56.791592983 +0000 UTC m=+147.891157960"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.795302 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gmqg9"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.798796 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.798843 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.809504 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" event={"ID":"0debf10a-a4dd-43d7-84fd-3456a2ad1b59","Type":"ContainerStarted","Data":"4c547f326e3236fb4d8e10d665fcc4c75fda577894c4cf738d990dc7263a23c6"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.814384 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" event={"ID":"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e","Type":"ContainerStarted","Data":"faf4d155e6b0f52ff5563396b5e941ee5df12f9b7e30b42c09f7e7e4837afc4c"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.815242 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.820793 4669 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8hc7m container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body=
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.820840 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" podUID="d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.828684 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" podStartSLOduration=126.828653385 podStartE2EDuration="2m6.828653385s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.827485266 +0000 UTC m=+147.927050243" watchObservedRunningTime="2025-10-01 11:30:56.828653385 +0000 UTC m=+147.928218362"
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.837530 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" event={"ID":"4f9513dc-5114-4ec2-81ec-d86a31c3635b","Type":"ContainerStarted","Data":"7b4ed1b2db80c06b06ca7701bc58d9ff94b92d5e01f1944f1ffbef6cecc561f4"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.841130 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" event={"ID":"796ccf8d-b179-440a-87a8-c6de61d08d4a","Type":"ContainerStarted","Data":"8a58b5630a0acc1eb02e510ac4ad4935938c8b41a767cbc13404f5e726571592"}
Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.842309 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" event={"ID":"c2afea9b-5446-4746-86f9-db70b4916992","Type":"ContainerStarted","Data":"f26424eeb102a9f9826f3efb25c50a22fdcbaca8c7ccb3dce6adcb0f4a788971"} Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.842919 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.843895 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" event={"ID":"b5100377-ee4b-4427-9106-eea735423f5a","Type":"ContainerStarted","Data":"c59f9ea783df00bcedf898eeaeef9ed0b7329e6c4cf9e8bac00c0cefd55d3886"} Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.844378 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.844963 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5bjch" event={"ID":"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2","Type":"ContainerStarted","Data":"194778860b740939b74f646e5aa917c321fd09078ae57ee58c9c4347049bbc61"} Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.847618 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" event={"ID":"f3827dd5-b842-4000-8b2e-37f7cc411542","Type":"ContainerStarted","Data":"e69cbe2d2794e4399c3f4c2ad7d82bb6ea93bfd685bef7ac839a736bba51283b"} Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.848953 4669 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-2r4jf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: 
connect: connection refused" start-of-body= Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.848998 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" podUID="b5100377-ee4b-4427-9106-eea735423f5a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.849111 4669 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-sfgk6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.849165 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" podUID="c2afea9b-5446-4746-86f9-db70b4916992" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.849584 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" event={"ID":"4ef90082-710c-48db-81f4-535db9195c2f","Type":"ContainerStarted","Data":"c7ceb120b40def5c251962ddf5c132daa04f5dc3120979c0fd46654bc01bf30e"} Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.849892 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.850829 4669 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-6nqdz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.850868 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" podUID="4ef90082-710c-48db-81f4-535db9195c2f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.881033 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.882575 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.382557214 +0000 UTC m=+148.482122191 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.904947 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" podStartSLOduration=126.904921715 podStartE2EDuration="2m6.904921715s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.867635536 +0000 UTC m=+147.967200513" watchObservedRunningTime="2025-10-01 11:30:56.904921715 +0000 UTC m=+148.004486692" Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.907116 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vhvv" podStartSLOduration=126.907108438 podStartE2EDuration="2m6.907108438s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.896418385 +0000 UTC m=+147.995983362" watchObservedRunningTime="2025-10-01 11:30:56.907108438 +0000 UTC m=+148.006673415" Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.926908 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5d22b" podStartSLOduration=126.926885085 podStartE2EDuration="2m6.926885085s" podCreationTimestamp="2025-10-01 
11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.924661311 +0000 UTC m=+148.024226288" watchObservedRunningTime="2025-10-01 11:30:56.926885085 +0000 UTC m=+148.026450062" Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.961914 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gmqg9" podStartSLOduration=126.961888977 podStartE2EDuration="2m6.961888977s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:56.959090248 +0000 UTC m=+148.058655225" watchObservedRunningTime="2025-10-01 11:30:56.961888977 +0000 UTC m=+148.061453954" Oct 01 11:30:56 crc kubenswrapper[4669]: I1001 11:30:56.990852 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:56 crc kubenswrapper[4669]: E1001 11:30:56.991409 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.491389084 +0000 UTC m=+148.590954061 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.030494 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" podStartSLOduration=127.030466817 podStartE2EDuration="2m7.030466817s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:57.004481797 +0000 UTC m=+148.104046774" watchObservedRunningTime="2025-10-01 11:30:57.030466817 +0000 UTC m=+148.130031794" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.032559 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" podStartSLOduration=127.032553258 podStartE2EDuration="2m7.032553258s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:57.029933684 +0000 UTC m=+148.129498661" watchObservedRunningTime="2025-10-01 11:30:57.032553258 +0000 UTC m=+148.132118235" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.091978 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.092172 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.592137956 +0000 UTC m=+148.691702933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.092505 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.092915 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.592899805 +0000 UTC m=+148.692464782 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.107046 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" podStartSLOduration=127.107017223 podStartE2EDuration="2m7.107017223s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:57.10286217 +0000 UTC m=+148.202427147" watchObservedRunningTime="2025-10-01 11:30:57.107017223 +0000 UTC m=+148.206582200" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.193782 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.194005 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.693969274 +0000 UTC m=+148.793534251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.194615 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.195005 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.69498604 +0000 UTC m=+148.794551017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.208476 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-x95jd" podStartSLOduration=127.208452101 podStartE2EDuration="2m7.208452101s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:57.201054509 +0000 UTC m=+148.300619486" watchObservedRunningTime="2025-10-01 11:30:57.208452101 +0000 UTC m=+148.308017078" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.295678 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.295921 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.795881335 +0000 UTC m=+148.895446312 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.296105 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.296508 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.79649175 +0000 UTC m=+148.896056727 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.306806 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" podStartSLOduration=127.306785714 podStartE2EDuration="2m7.306785714s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:57.304788334 +0000 UTC m=+148.404353311" watchObservedRunningTime="2025-10-01 11:30:57.306785714 +0000 UTC m=+148.406350691" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.335748 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" podStartSLOduration=127.335727246 podStartE2EDuration="2m7.335727246s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:57.33503377 +0000 UTC m=+148.434598777" watchObservedRunningTime="2025-10-01 11:30:57.335727246 +0000 UTC m=+148.435292223" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.369741 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wn6cw" podStartSLOduration=127.369717574 podStartE2EDuration="2m7.369717574s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:57.368185265 +0000 UTC m=+148.467750242" watchObservedRunningTime="2025-10-01 11:30:57.369717574 +0000 UTC m=+148.469282551" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.397296 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.397532 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.897494558 +0000 UTC m=+148.997059535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.398041 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.398443 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.898425601 +0000 UTC m=+148.997990578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.499318 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.499511 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:57.99947771 +0000 UTC m=+149.099042677 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.499715 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.500107 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.000098936 +0000 UTC m=+149.099664003 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.601173 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.601415 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.10136969 +0000 UTC m=+149.200934667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.601890 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.602331 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.102320764 +0000 UTC m=+149.201885821 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.703764 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.704011 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.203967647 +0000 UTC m=+149.303532624 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.704069 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.704126 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.704180 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.704211 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.704242 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.704571 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.204557772 +0000 UTC m=+149.304122749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.709064 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.712012 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.726913 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.732037 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.761391 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.778013 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.805039 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.805481 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.305461628 +0000 UTC m=+149.405026605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.812001 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 11:30:57 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 01 11:30:57 crc kubenswrapper[4669]: [+]process-running ok Oct 01 11:30:57 crc kubenswrapper[4669]: healthz check failed Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.812101 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.873085 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p62sv" event={"ID":"0debf10a-a4dd-43d7-84fd-3456a2ad1b59","Type":"ContainerStarted","Data":"723212fdc661992f22497cb4e5e0629d4b587152190be7714fa620b583d4385c"} Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.900290 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4" event={"ID":"69d480e4-07d0-4081-a7b1-9c8568ee449a","Type":"ContainerStarted","Data":"9e7daa24f0b575b6748de07e37c60070a17a882582ece5d8aa018f79a66cc720"} Oct 01 11:30:57 crc 
kubenswrapper[4669]: I1001 11:30:57.908951 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:57 crc kubenswrapper[4669]: E1001 11:30:57.909560 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.409538921 +0000 UTC m=+149.509103898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.925572 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" podStartSLOduration=127.925554316 podStartE2EDuration="2m7.925554316s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:57.39674545 +0000 UTC m=+148.496310427" watchObservedRunningTime="2025-10-01 11:30:57.925554316 +0000 UTC m=+149.025119293" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.926610 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-k7fz4" podStartSLOduration=127.926604612 podStartE2EDuration="2m7.926604612s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:57.923230139 +0000 UTC m=+149.022795116" watchObservedRunningTime="2025-10-01 11:30:57.926604612 +0000 UTC m=+149.026169589" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.931123 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" event={"ID":"35ac61d5-a664-4f55-9a9d-c80e3dc18b16","Type":"ContainerStarted","Data":"c216b01b6f61a6ac468e6d059a1b07aaaa48e711f9c563e80082200537d9de26"} Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.936619 4669 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dcmjv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.936706 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" podUID="35ac61d5-a664-4f55-9a9d-c80e3dc18b16" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.952837 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" event={"ID":"78380b45-27e9-43cf-8e16-c8c63e0c217f","Type":"ContainerStarted","Data":"68e325f3453da15fcffabbd9e1f3f5a7bb60af443b32c5e0bb57433f684133d8"} Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.976415 4669 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" event={"ID":"6c7c8f4d-66b6-4e26-b7cb-fbf332fc1b1e","Type":"ContainerStarted","Data":"63e9af778f63b79da44505539575e3205a6f296fe2fc79879917b7ba04b63fc1"} Oct 01 11:30:57 crc kubenswrapper[4669]: I1001 11:30:57.989557 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.009856 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.010809 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.510772245 +0000 UTC m=+149.610337242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.040819 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhwl4" podStartSLOduration=128.040792935 podStartE2EDuration="2m8.040792935s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:58.006824478 +0000 UTC m=+149.106389455" watchObservedRunningTime="2025-10-01 11:30:58.040792935 +0000 UTC m=+149.140357912" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.042900 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-zxlg8" podStartSLOduration=128.042888256 podStartE2EDuration="2m8.042888256s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:58.041576004 +0000 UTC m=+149.141140981" watchObservedRunningTime="2025-10-01 11:30:58.042888256 +0000 UTC m=+149.142453243" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.054118 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" event={"ID":"8e2e100b-8917-4730-83b1-2fc7716f740b","Type":"ContainerStarted","Data":"2afe2eeb945417e1eca61b56ecbf5dae6f7eee3e0fa12f7084096f4ff6b9ac34"} 
Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.055023 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.091732 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" podStartSLOduration=128.091705969 podStartE2EDuration="2m8.091705969s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:58.090525779 +0000 UTC m=+149.190090756" watchObservedRunningTime="2025-10-01 11:30:58.091705969 +0000 UTC m=+149.191270946" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.120589 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.121311 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.621287947 +0000 UTC m=+149.720852924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.163621 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5bjch" event={"ID":"dc5fcd20-c6a6-4960-b44b-571ec0f6d8f2","Type":"ContainerStarted","Data":"7a9795dfb6c2c932d4355ccafb33e6b0b3aefcc1b7c75415e6b7f9833ab69c0d"} Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.164804 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5bjch" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.177787 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6vbrp" event={"ID":"06fa5e25-562e-4bde-96d5-c0877aa235f7","Type":"ContainerStarted","Data":"a17cc62043710fbc8403d1ed6f36ea550da2bde875859fe4bc19c0889336277f"} Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.200009 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" event={"ID":"1bf82d78-0b71-43b4-b6d3-babe39dd328e","Type":"ContainerStarted","Data":"1320b8b97c3d8a26935a44d7828b48ecd81fff54ddcb8fa3869ce1bb4d4f6eeb"} Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.210044 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5bjch" podStartSLOduration=10.210020324 podStartE2EDuration="10.210020324s" podCreationTimestamp="2025-10-01 11:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-10-01 11:30:58.208369483 +0000 UTC m=+149.307934460" watchObservedRunningTime="2025-10-01 11:30:58.210020324 +0000 UTC m=+149.309585301" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.218482 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" event={"ID":"1d0ee8e1-4e70-40fe-8780-567c7b49825b","Type":"ContainerStarted","Data":"8dbcf26765ac0cb22a59f9ba7ae2660ed715ef33452d647b156169077760a80b"} Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.222637 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.222877 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.72284558 +0000 UTC m=+149.822410557 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.223126 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.224719 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.724697855 +0000 UTC m=+149.824262832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.236364 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9x9hq" podStartSLOduration=128.236344183 podStartE2EDuration="2m8.236344183s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:58.236130896 +0000 UTC m=+149.335695863" watchObservedRunningTime="2025-10-01 11:30:58.236344183 +0000 UTC m=+149.335909160" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.258422 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" event={"ID":"f3827dd5-b842-4000-8b2e-37f7cc411542","Type":"ContainerStarted","Data":"7989bac829674d6eda02009bf1b3495de06c281f3b3a2f3210d5b28f7f53c222"} Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.263639 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" event={"ID":"4f48de82-89d5-4ef8-b5fe-71ef81240421","Type":"ContainerStarted","Data":"437e56ac793c3967070bab626ce941c44040dad43301404600789d936f8cbba2"} Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.270122 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-56qlt" 
event={"ID":"00e38d1c-41cd-437a-a6e9-3a53fe903c11","Type":"ContainerStarted","Data":"8c027a757ffef13ecea4aff562b9eaaed72d1d48ac5b259d7a1714587541425a"} Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.270910 4669 patch_prober.go:28] interesting pod/console-operator-58897d9998-qqhrp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.271027 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qqhrp" podUID="d223984b-062e-423a-bfb9-f28dc4dd215b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.274509 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.274536 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.308894 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6nqdz" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.316101 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sbx7t" podStartSLOduration=128.316059515 podStartE2EDuration="2m8.316059515s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:58.295809217 +0000 UTC m=+149.395374214" watchObservedRunningTime="2025-10-01 11:30:58.316059515 +0000 UTC m=+149.415624492" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.339246 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.340575 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-czlnv" Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.349842 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.849811378 +0000 UTC m=+149.949376355 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.351787 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.352553 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.852543675 +0000 UTC m=+149.952108652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.364563 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dqh7b" podStartSLOduration=128.36452689 podStartE2EDuration="2m8.36452689s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:30:58.36169149 +0000 UTC m=+149.461256467" watchObservedRunningTime="2025-10-01 11:30:58.36452689 +0000 UTC m=+149.464091867" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.477762 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.482701 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:58.98267305 +0000 UTC m=+150.082238027 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.579667 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.580148 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.080132471 +0000 UTC m=+150.179697448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.680717 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.681119 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.181097638 +0000 UTC m=+150.280662615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.750070 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.783219 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.783835 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.283813229 +0000 UTC m=+150.383378206 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.803477 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 11:30:58 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 01 11:30:58 crc kubenswrapper[4669]: [+]process-running ok Oct 01 11:30:58 crc kubenswrapper[4669]: healthz check failed Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.803554 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.884860 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.885343 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 11:30:59.385321479 +0000 UTC m=+150.484886456 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:58 crc kubenswrapper[4669]: I1001 11:30:58.987531 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:58 crc kubenswrapper[4669]: E1001 11:30:58.988201 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.488186684 +0000 UTC m=+150.587751661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.088899 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.089124 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.589089079 +0000 UTC m=+150.688654056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.089279 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.089656 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.589647723 +0000 UTC m=+150.689212780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.190701 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.190943 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.690907947 +0000 UTC m=+150.790472924 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.191042 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.191434 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.69142654 +0000 UTC m=+150.790991517 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.271862 4669 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8hc7m container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.271948 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" podUID="d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.272266 4669 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-sfgk6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.272335 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" podUID="c2afea9b-5446-4746-86f9-db70b4916992" containerName="packageserver" probeResult="failure" 
output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.277185 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e6c76fe2350b73a7a535bee4778160f426c2ce65eeb602eb135b599f1bc2fb63"} Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.277262 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b02951805a0067d94f1e3122c5d6643d4cd463ed81d3f2aeda9e453a1cb0afad"} Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.278881 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d158e0450082d6207081701de9de12ed14904fd02fcf866ea25178ae0bf3e971"} Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.278935 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7495adbab47a210b7dec526ea527f1f7edd115a04d2ae4eaf5ab6fb4346e5566"} Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.283094 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d47e9e67b6e9485d88fa246c7db11a2452ae10dee3726eb3db01b780b15999e4"} Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.283163 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"958b25c35aca7a5b6e80fe542f768a3d0ed33dadfe50b6b933703796075204c0"} Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.283723 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.284989 4669 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dcmjv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.285038 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" podUID="35ac61d5-a664-4f55-9a9d-c80e3dc18b16" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.285921 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.285969 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.295491 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.296057 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.796037386 +0000 UTC m=+150.895602363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.397452 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.405280 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.905257617 +0000 UTC m=+151.004822584 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.499068 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.499347 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.999304565 +0000 UTC m=+151.098869542 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.499414 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.499918 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:30:59.999911299 +0000 UTC m=+151.099476276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.600344 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.600591 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.100551438 +0000 UTC m=+151.200116415 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.600694 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.600995 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.100977829 +0000 UTC m=+151.200542806 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.702462 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.702709 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.202676004 +0000 UTC m=+151.302240981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.702771 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.703173 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.203163806 +0000 UTC m=+151.302728773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.735345 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.752160 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.752301 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.783307 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.784559 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.797986 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.800947 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 11:30:59 crc kubenswrapper[4669]: [-]has-synced 
failed: reason withheld Oct 01 11:30:59 crc kubenswrapper[4669]: [+]process-running ok Oct 01 11:30:59 crc kubenswrapper[4669]: healthz check failed Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.801011 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.806557 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.807910 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.307890446 +0000 UTC m=+151.407455423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.909907 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:30:59 crc kubenswrapper[4669]: E1001 11:30:59.911111 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.411092899 +0000 UTC m=+151.510657876 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:30:59 crc kubenswrapper[4669]: I1001 11:30:59.920302 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sfgk6" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.010787 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.011032 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.510980049 +0000 UTC m=+151.610545026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.011102 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.011507 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.511499421 +0000 UTC m=+151.611064398 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.029701 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qqhrp" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.113181 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.113401 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.613366241 +0000 UTC m=+151.712931358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.149931 4669 patch_prober.go:28] interesting pod/apiserver-76f77b778f-flpxd container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 01 11:31:00 crc kubenswrapper[4669]: [+]log ok Oct 01 11:31:00 crc kubenswrapper[4669]: [+]etcd ok Oct 01 11:31:00 crc kubenswrapper[4669]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 01 11:31:00 crc kubenswrapper[4669]: [+]poststarthook/generic-apiserver-start-informers ok Oct 01 11:31:00 crc kubenswrapper[4669]: [+]poststarthook/max-in-flight-filter ok Oct 01 11:31:00 crc kubenswrapper[4669]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 01 11:31:00 crc kubenswrapper[4669]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 01 11:31:00 crc kubenswrapper[4669]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 01 11:31:00 crc kubenswrapper[4669]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 01 11:31:00 crc kubenswrapper[4669]: [+]poststarthook/project.openshift.io-projectcache ok Oct 01 11:31:00 crc kubenswrapper[4669]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 01 11:31:00 crc kubenswrapper[4669]: [+]poststarthook/openshift.io-startinformers ok Oct 01 11:31:00 crc kubenswrapper[4669]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 01 11:31:00 crc 
kubenswrapper[4669]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 01 11:31:00 crc kubenswrapper[4669]: livez check failed Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.150016 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" podUID="4c8e059d-1baa-4142-a6e9-af3c9bfe16d3" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.211004 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n2l4s"] Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.212021 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.214445 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.214828 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.71480748 +0000 UTC m=+151.814372457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.221584 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.242268 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n2l4s"] Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.286560 4669 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8hc7m container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.286657 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" podUID="d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.310630 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n7wk4" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.316219 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.316492 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tccjw\" (UniqueName: \"kubernetes.io/projected/0b8d4849-98dd-4b1b-90dc-9151e8b17224-kube-api-access-tccjw\") pod \"certified-operators-n2l4s\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.316558 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-utilities\") pod \"certified-operators-n2l4s\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.316593 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-catalog-content\") pod \"certified-operators-n2l4s\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.316719 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.81670243 +0000 UTC m=+151.916267407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.353124 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2qkmt" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.412842 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hwdf6"] Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.413920 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.417636 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-utilities\") pod \"certified-operators-n2l4s\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.417704 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-catalog-content\") pod \"certified-operators-n2l4s\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.417804 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.418038 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tccjw\" (UniqueName: \"kubernetes.io/projected/0b8d4849-98dd-4b1b-90dc-9151e8b17224-kube-api-access-tccjw\") pod \"certified-operators-n2l4s\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:31:00 crc kubenswrapper[4669]: W1001 11:31:00.420514 4669 reflector.go:561] object-"openshift-marketplace"/"community-operators-dockercfg-dmngl": failed to list *v1.Secret: secrets "community-operators-dockercfg-dmngl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.420552 4669 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"community-operators-dockercfg-dmngl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"community-operators-dockercfg-dmngl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.426858 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:00.926835873 +0000 UTC m=+152.026400850 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.427189 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-catalog-content\") pod \"certified-operators-n2l4s\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.427439 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-utilities\") pod \"certified-operators-n2l4s\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.487864 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tccjw\" (UniqueName: \"kubernetes.io/projected/0b8d4849-98dd-4b1b-90dc-9151e8b17224-kube-api-access-tccjw\") pod \"certified-operators-n2l4s\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.489344 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hwdf6"] Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.519580 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.520175 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-catalog-content\") pod \"community-operators-hwdf6\" (UID: \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.520283 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-utilities\") pod \"community-operators-hwdf6\" (UID: \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.520413 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xktr\" (UniqueName: \"kubernetes.io/projected/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-kube-api-access-4xktr\") pod \"community-operators-hwdf6\" (UID: \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.520595 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.020572922 +0000 UTC m=+152.120137899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.528598 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.583816 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p69c8"] Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.585054 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p69c8" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.608263 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p69c8"] Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.621683 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xktr\" (UniqueName: \"kubernetes.io/projected/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-kube-api-access-4xktr\") pod \"community-operators-hwdf6\" (UID: \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.621741 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.621776 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-catalog-content\") pod \"community-operators-hwdf6\" (UID: \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.621796 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-utilities\") pod \"community-operators-hwdf6\" (UID: \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.621830 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-utilities\") pod \"certified-operators-p69c8\" (UID: \"7b67d961-10c4-45b0-84e0-99ff6afc366a\") " pod="openshift-marketplace/certified-operators-p69c8" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.621855 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-catalog-content\") pod \"certified-operators-p69c8\" (UID: \"7b67d961-10c4-45b0-84e0-99ff6afc366a\") " pod="openshift-marketplace/certified-operators-p69c8" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.621903 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwf2w\" (UniqueName: \"kubernetes.io/projected/7b67d961-10c4-45b0-84e0-99ff6afc366a-kube-api-access-qwf2w\") pod \"certified-operators-p69c8\" (UID: 
\"7b67d961-10c4-45b0-84e0-99ff6afc366a\") " pod="openshift-marketplace/certified-operators-p69c8" Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.622675 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.122661847 +0000 UTC m=+152.222226824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.622729 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-utilities\") pod \"community-operators-hwdf6\" (UID: \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.623025 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-catalog-content\") pod \"community-operators-hwdf6\" (UID: \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.646264 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xktr\" (UniqueName: \"kubernetes.io/projected/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-kube-api-access-4xktr\") pod \"community-operators-hwdf6\" (UID: 
\"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.722881 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.723266 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwf2w\" (UniqueName: \"kubernetes.io/projected/7b67d961-10c4-45b0-84e0-99ff6afc366a-kube-api-access-qwf2w\") pod \"certified-operators-p69c8\" (UID: \"7b67d961-10c4-45b0-84e0-99ff6afc366a\") " pod="openshift-marketplace/certified-operators-p69c8" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.723333 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-utilities\") pod \"certified-operators-p69c8\" (UID: \"7b67d961-10c4-45b0-84e0-99ff6afc366a\") " pod="openshift-marketplace/certified-operators-p69c8" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.723361 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-catalog-content\") pod \"certified-operators-p69c8\" (UID: \"7b67d961-10c4-45b0-84e0-99ff6afc366a\") " pod="openshift-marketplace/certified-operators-p69c8" Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.724238 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 11:31:01.224203568 +0000 UTC m=+152.323768545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.724361 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-catalog-content\") pod \"certified-operators-p69c8\" (UID: \"7b67d961-10c4-45b0-84e0-99ff6afc366a\") " pod="openshift-marketplace/certified-operators-p69c8" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.724438 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-utilities\") pod \"certified-operators-p69c8\" (UID: \"7b67d961-10c4-45b0-84e0-99ff6afc366a\") " pod="openshift-marketplace/certified-operators-p69c8" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.757057 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwf2w\" (UniqueName: \"kubernetes.io/projected/7b67d961-10c4-45b0-84e0-99ff6afc366a-kube-api-access-qwf2w\") pod \"certified-operators-p69c8\" (UID: \"7b67d961-10c4-45b0-84e0-99ff6afc366a\") " pod="openshift-marketplace/certified-operators-p69c8" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.788492 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kk6f2"] Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.789978 4669 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk6f2" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.808115 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 11:31:00 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 01 11:31:00 crc kubenswrapper[4669]: [+]process-running ok Oct 01 11:31:00 crc kubenswrapper[4669]: healthz check failed Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.808563 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.812721 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk6f2"] Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.827015 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl82v\" (UniqueName: \"kubernetes.io/projected/356dfcd7-c70a-4494-aeed-89aa3393ecd9-kube-api-access-sl82v\") pod \"community-operators-kk6f2\" (UID: \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") " pod="openshift-marketplace/community-operators-kk6f2" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.827097 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-catalog-content\") pod \"community-operators-kk6f2\" (UID: \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") " pod="openshift-marketplace/community-operators-kk6f2" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 
11:31:00.827382 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.827441 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-utilities\") pod \"community-operators-kk6f2\" (UID: \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") " pod="openshift-marketplace/community-operators-kk6f2" Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.827950 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.327934664 +0000 UTC m=+152.427499641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.907052 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n2l4s"] Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.921383 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p69c8" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.928973 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.929227 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-catalog-content\") pod \"community-operators-kk6f2\" (UID: \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") " pod="openshift-marketplace/community-operators-kk6f2" Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.929313 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.42925307 +0000 UTC m=+152.528818047 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.929392 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.929463 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-utilities\") pod \"community-operators-kk6f2\" (UID: \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") " pod="openshift-marketplace/community-operators-kk6f2" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.929644 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl82v\" (UniqueName: \"kubernetes.io/projected/356dfcd7-c70a-4494-aeed-89aa3393ecd9-kube-api-access-sl82v\") pod \"community-operators-kk6f2\" (UID: \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") " pod="openshift-marketplace/community-operators-kk6f2" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.929790 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-catalog-content\") pod \"community-operators-kk6f2\" (UID: 
\"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") " pod="openshift-marketplace/community-operators-kk6f2" Oct 01 11:31:00 crc kubenswrapper[4669]: E1001 11:31:00.929971 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.429949507 +0000 UTC m=+152.529514484 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.930460 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-utilities\") pod \"community-operators-kk6f2\" (UID: \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") " pod="openshift-marketplace/community-operators-kk6f2" Oct 01 11:31:00 crc kubenswrapper[4669]: I1001 11:31:00.958242 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl82v\" (UniqueName: \"kubernetes.io/projected/356dfcd7-c70a-4494-aeed-89aa3393ecd9-kube-api-access-sl82v\") pod \"community-operators-kk6f2\" (UID: \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") " pod="openshift-marketplace/community-operators-kk6f2" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.031585 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.032014 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.53197596 +0000 UTC m=+152.631540937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.032169 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.032511 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.532502534 +0000 UTC m=+152.632067511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.133210 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.133581 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.633542683 +0000 UTC m=+152.733107720 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.235846 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.236374 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.736351115 +0000 UTC m=+152.835916092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.294106 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.294182 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.296993 4669 patch_prober.go:28] interesting pod/console-f9d7485db-cclkd container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.297067 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-cclkd" podUID="1467a745-44bf-40c6-a065-5008543d1363" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.328449 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2l4s" event={"ID":"0b8d4849-98dd-4b1b-90dc-9151e8b17224","Type":"ContainerStarted","Data":"92c4bcccbd45d80a42b780568943b89f999ba61ad747ccd6f9f832e4ae826559"} Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.339720 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.339976 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.839928827 +0000 UTC m=+152.939493814 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.340218 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.340732 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.840709446 +0000 UTC m=+152.940274423 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.345269 4669 generic.go:334] "Generic (PLEG): container finished" podID="0ed84780-32e8-41fe-a20d-4c7a633ee541" containerID="b2ef4c3a87df0e2cf56a0fd6b2e309deb960a8c375c556503bb0dbc4459a544d" exitCode=0 Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.345360 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" event={"ID":"0ed84780-32e8-41fe-a20d-4c7a633ee541","Type":"ContainerDied","Data":"b2ef4c3a87df0e2cf56a0fd6b2e309deb960a8c375c556503bb0dbc4459a544d"} Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.349140 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" event={"ID":"1d0ee8e1-4e70-40fe-8780-567c7b49825b","Type":"ContainerStarted","Data":"d20b1d84a388adc0ac30101ea09f8bb6fe571079137119fc6bb9aa702f1b7306"} Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.378429 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p69c8"] Oct 01 11:31:01 crc kubenswrapper[4669]: W1001 11:31:01.392101 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b67d961_10c4_45b0_84e0_99ff6afc366a.slice/crio-1a1378ed20cc8ad4f4a8a2333583980af216de872bef1580804b2a41e48152ed WatchSource:0}: Error finding container 1a1378ed20cc8ad4f4a8a2333583980af216de872bef1580804b2a41e48152ed: Status 404 returned error can't 
find the container with id 1a1378ed20cc8ad4f4a8a2333583980af216de872bef1580804b2a41e48152ed Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.410499 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.410566 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.410596 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.410647 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.411698 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.418837 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk6f2" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.419538 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.444646 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.444892 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.944854521 +0000 UTC m=+153.044419498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.445279 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.448232 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:01.948206654 +0000 UTC m=+153.047771631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.452664 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.454341 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.457539 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.457743 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.457829 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.459759 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.559521 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.559802 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.059768702 +0000 UTC m=+153.159333679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.560519 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.560763 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.560807 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.562252 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.062228993 +0000 UTC m=+153.161793970 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.665181 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.665332 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.165305223 +0000 UTC m=+153.264870190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.665528 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.665581 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.665609 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.665698 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.665871 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.165863296 +0000 UTC m=+153.265428273 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.681228 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hwdf6"] Oct 01 11:31:01 crc kubenswrapper[4669]: W1001 11:31:01.697440 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded0742b0_d8c4_4b2a_b0d1_2ea0bb44d5e0.slice/crio-2b0c25ff2a9c9c70be0f2e6a4e3bd7e2a2c2949e284f37b75ff5dfe2e491bb49 WatchSource:0}: Error finding container 2b0c25ff2a9c9c70be0f2e6a4e3bd7e2a2c2949e284f37b75ff5dfe2e491bb49: Status 404 returned error can't find the container with id 2b0c25ff2a9c9c70be0f2e6a4e3bd7e2a2c2949e284f37b75ff5dfe2e491bb49 Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.702131 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.762429 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk6f2"] Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.767284 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.767535 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.26750411 +0000 UTC m=+153.367069087 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.767693 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.768047 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.268030093 +0000 UTC m=+153.367595070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.794876 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.801194 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 11:31:01 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 01 11:31:01 crc kubenswrapper[4669]: [+]process-running ok Oct 01 11:31:01 crc kubenswrapper[4669]: healthz check failed Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.801291 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.860153 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.863941 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.864104 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.868932 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.870837 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.370805994 +0000 UTC m=+153.470370971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.948964 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:31:01 crc kubenswrapper[4669]: I1001 11:31:01.977133 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:01 crc kubenswrapper[4669]: E1001 11:31:01.977584 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.477565555 +0000 UTC m=+153.577130542 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.078373 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.082525 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.582483898 +0000 UTC m=+153.682048875 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.185565 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.186041 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.686022839 +0000 UTC m=+153.785587816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.192990 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.286863 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.287064 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.787029388 +0000 UTC m=+153.886594365 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.287163 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.287561 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.78754482 +0000 UTC m=+153.887109797 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.356562 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c","Type":"ContainerStarted","Data":"37edec748fd67d268708a0d936e8f30bbcc2445095c41249646d7f31bf2b9314"} Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.358501 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p69c8" event={"ID":"7b67d961-10c4-45b0-84e0-99ff6afc366a","Type":"ContainerStarted","Data":"1a1378ed20cc8ad4f4a8a2333583980af216de872bef1580804b2a41e48152ed"} Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.360353 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk6f2" event={"ID":"356dfcd7-c70a-4494-aeed-89aa3393ecd9","Type":"ContainerStarted","Data":"102f589db7fa912ba9395d959c865c2cb6efdb4f30f2600e800599e05bce2aff"} Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.365824 4669 generic.go:334] "Generic (PLEG): container finished" podID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerID="e36771dab9db36589cddb757be899da08d4ce5c6ea2b0e34e98cfc390e2a9cd1" exitCode=0 Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.365902 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2l4s" 
event={"ID":"0b8d4849-98dd-4b1b-90dc-9151e8b17224","Type":"ContainerDied","Data":"e36771dab9db36589cddb757be899da08d4ce5c6ea2b0e34e98cfc390e2a9cd1"} Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.367401 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdf6" event={"ID":"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0","Type":"ContainerStarted","Data":"2b0c25ff2a9c9c70be0f2e6a4e3bd7e2a2c2949e284f37b75ff5dfe2e491bb49"} Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.383734 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sf24n"] Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.390017 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.390148 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.390302 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.8902594 +0000 UTC m=+153.989824407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.390607 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.391208 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.891193034 +0000 UTC m=+153.990758041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.393101 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.400099 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf24n"] Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.491397 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.491707 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-catalog-content\") pod \"redhat-marketplace-sf24n\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.491804 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 11:31:02.99172104 +0000 UTC m=+154.091286037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.492310 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-utilities\") pod \"redhat-marketplace-sf24n\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.492660 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.492730 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjk78\" (UniqueName: \"kubernetes.io/projected/b90bf450-0186-475a-97dd-cb6ad25fb687-kube-api-access-vjk78\") pod \"redhat-marketplace-sf24n\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.493239 4669 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:02.993220377 +0000 UTC m=+154.092785404 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.522301 4669 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.595953 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.596493 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:03.09641837 +0000 UTC m=+154.195983357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.596570 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjk78\" (UniqueName: \"kubernetes.io/projected/b90bf450-0186-475a-97dd-cb6ad25fb687-kube-api-access-vjk78\") pod \"redhat-marketplace-sf24n\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.596671 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-catalog-content\") pod \"redhat-marketplace-sf24n\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.596779 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-utilities\") pod \"redhat-marketplace-sf24n\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.597457 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-utilities\") pod \"redhat-marketplace-sf24n\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " 
pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.597580 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-catalog-content\") pod \"redhat-marketplace-sf24n\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.617468 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.622066 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjk78\" (UniqueName: \"kubernetes.io/projected/b90bf450-0186-475a-97dd-cb6ad25fb687-kube-api-access-vjk78\") pod \"redhat-marketplace-sf24n\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.698206 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ed84780-32e8-41fe-a20d-4c7a633ee541-config-volume\") pod \"0ed84780-32e8-41fe-a20d-4c7a633ee541\" (UID: \"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.698271 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjqsn\" (UniqueName: \"kubernetes.io/projected/0ed84780-32e8-41fe-a20d-4c7a633ee541-kube-api-access-vjqsn\") pod \"0ed84780-32e8-41fe-a20d-4c7a633ee541\" (UID: \"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.698317 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0ed84780-32e8-41fe-a20d-4c7a633ee541-secret-volume\") pod \"0ed84780-32e8-41fe-a20d-4c7a633ee541\" (UID: \"0ed84780-32e8-41fe-a20d-4c7a633ee541\") " Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.698493 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.699026 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 11:31:03.198986246 +0000 UTC m=+154.298551223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-s7dqt" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.699308 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ed84780-32e8-41fe-a20d-4c7a633ee541-config-volume" (OuterVolumeSpecName: "config-volume") pod "0ed84780-32e8-41fe-a20d-4c7a633ee541" (UID: "0ed84780-32e8-41fe-a20d-4c7a633ee541"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.702011 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed84780-32e8-41fe-a20d-4c7a633ee541-kube-api-access-vjqsn" (OuterVolumeSpecName: "kube-api-access-vjqsn") pod "0ed84780-32e8-41fe-a20d-4c7a633ee541" (UID: "0ed84780-32e8-41fe-a20d-4c7a633ee541"). InnerVolumeSpecName "kube-api-access-vjqsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.706415 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ed84780-32e8-41fe-a20d-4c7a633ee541-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0ed84780-32e8-41fe-a20d-4c7a633ee541" (UID: "0ed84780-32e8-41fe-a20d-4c7a633ee541"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.729395 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.781278 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d5s7q"] Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.788284 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed84780-32e8-41fe-a20d-4c7a633ee541" containerName="collect-profiles" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.788343 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed84780-32e8-41fe-a20d-4c7a633ee541" containerName="collect-profiles" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.788549 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed84780-32e8-41fe-a20d-4c7a633ee541" containerName="collect-profiles" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.789683 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5s7q" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.797335 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5s7q"] Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.800400 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 11:31:02 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 01 11:31:02 crc kubenswrapper[4669]: [+]process-running ok Oct 01 11:31:02 crc kubenswrapper[4669]: healthz check failed Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.800463 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.801715 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.802159 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ed84780-32e8-41fe-a20d-4c7a633ee541-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.802183 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjqsn\" (UniqueName: \"kubernetes.io/projected/0ed84780-32e8-41fe-a20d-4c7a633ee541-kube-api-access-vjqsn\") on node \"crc\" DevicePath \"\"" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.802196 4669 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ed84780-32e8-41fe-a20d-4c7a633ee541-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 11:31:02 crc kubenswrapper[4669]: E1001 11:31:02.802304 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 11:31:03.30224891 +0000 UTC m=+154.401813897 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.827249 4669 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-01T11:31:02.522331254Z","Handler":null,"Name":""} Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.833965 4669 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.834022 4669 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.904012 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rhrd\" (UniqueName: \"kubernetes.io/projected/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-kube-api-access-6rhrd\") pod \"redhat-marketplace-d5s7q\" (UID: \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") " pod="openshift-marketplace/redhat-marketplace-d5s7q" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.904722 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-utilities\") pod \"redhat-marketplace-d5s7q\" (UID: 
\"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") " pod="openshift-marketplace/redhat-marketplace-d5s7q" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.904850 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.905153 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-catalog-content\") pod \"redhat-marketplace-d5s7q\" (UID: \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") " pod="openshift-marketplace/redhat-marketplace-d5s7q" Oct 01 11:31:02 crc kubenswrapper[4669]: I1001 11:31:02.980456 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf24n"] Oct 01 11:31:02 crc kubenswrapper[4669]: W1001 11:31:02.992473 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb90bf450_0186_475a_97dd_cb6ad25fb687.slice/crio-1ad7dfe022468d5dfb86e0096c698a667faef5523c53dc3aba55db8f7c55c199 WatchSource:0}: Error finding container 1ad7dfe022468d5dfb86e0096c698a667faef5523c53dc3aba55db8f7c55c199: Status 404 returned error can't find the container with id 1ad7dfe022468d5dfb86e0096c698a667faef5523c53dc3aba55db8f7c55c199 Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.006069 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-catalog-content\") pod \"redhat-marketplace-d5s7q\" (UID: 
\"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") " pod="openshift-marketplace/redhat-marketplace-d5s7q" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.006177 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rhrd\" (UniqueName: \"kubernetes.io/projected/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-kube-api-access-6rhrd\") pod \"redhat-marketplace-d5s7q\" (UID: \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") " pod="openshift-marketplace/redhat-marketplace-d5s7q" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.006194 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-utilities\") pod \"redhat-marketplace-d5s7q\" (UID: \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") " pod="openshift-marketplace/redhat-marketplace-d5s7q" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.006626 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-utilities\") pod \"redhat-marketplace-d5s7q\" (UID: \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") " pod="openshift-marketplace/redhat-marketplace-d5s7q" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.006651 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-catalog-content\") pod \"redhat-marketplace-d5s7q\" (UID: \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") " pod="openshift-marketplace/redhat-marketplace-d5s7q" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.029954 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rhrd\" (UniqueName: \"kubernetes.io/projected/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-kube-api-access-6rhrd\") pod \"redhat-marketplace-d5s7q\" (UID: \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") " 
pod="openshift-marketplace/redhat-marketplace-d5s7q" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.114104 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5s7q" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.147468 4669 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.147574 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.226166 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-s7dqt\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.311698 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.347070 4669 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.377109 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c","Type":"ContainerStarted","Data":"8fd6d85d2a977364c74425325f2258a2f31492484dc02e13258abde94245fd9e"} Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.383717 4669 generic.go:334] "Generic (PLEG): container finished" podID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerID="880dd85834b79bb98f10c3f0ea13d9fb3d66165a1b1da19c6a29ab7b13cd8d97" exitCode=0 Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.383784 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p69c8" event={"ID":"7b67d961-10c4-45b0-84e0-99ff6afc366a","Type":"ContainerDied","Data":"880dd85834b79bb98f10c3f0ea13d9fb3d66165a1b1da19c6a29ab7b13cd8d97"} Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.390287 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5bzzr"] Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.391822 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.397506 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.399556 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" event={"ID":"1d0ee8e1-4e70-40fe-8780-567c7b49825b","Type":"ContainerStarted","Data":"8e8cfb0ccee8d3c3afedbc8c72c92123a099661d37107c4d9bf5803ef1eb2e67"} Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.401930 4669 generic.go:334] "Generic (PLEG): container finished" podID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerID="d8ccecc384ce56c09e747ab2d173a64f7a4de94f506761bb48dae3ee99a0acab" exitCode=0 Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.401980 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk6f2" event={"ID":"356dfcd7-c70a-4494-aeed-89aa3393ecd9","Type":"ContainerDied","Data":"d8ccecc384ce56c09e747ab2d173a64f7a4de94f506761bb48dae3ee99a0acab"} Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.402932 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5s7q"] Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.403167 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf24n" event={"ID":"b90bf450-0186-475a-97dd-cb6ad25fb687","Type":"ContainerStarted","Data":"1ad7dfe022468d5dfb86e0096c698a667faef5523c53dc3aba55db8f7c55c199"} Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.404203 4669 generic.go:334] "Generic (PLEG): container finished" podID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerID="cff7aa9a4509e7a30757711d6805a6807d49255512c73b8274a04239b818c737" exitCode=0 Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.404247 4669 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdf6" event={"ID":"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0","Type":"ContainerDied","Data":"cff7aa9a4509e7a30757711d6805a6807d49255512c73b8274a04239b818c737"} Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.407210 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5bzzr"] Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.407942 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.407997 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk" event={"ID":"0ed84780-32e8-41fe-a20d-4c7a633ee541","Type":"ContainerDied","Data":"21efbfbefc91124dc387c7e3678a72db891feeb8516a2393a85ed4cfc6177508"} Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.408024 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21efbfbefc91124dc387c7e3678a72db891feeb8516a2393a85ed4cfc6177508" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.449480 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.456793 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 11:31:03 crc kubenswrapper[4669]: W1001 11:31:03.460408 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4423fe7_fc2d_4e49_b39f_a9641ce1c28b.slice/crio-e7a9092997d39da6304a7e4ffd3523d41311b9ad89df253cb01718cbd103812e WatchSource:0}: Error finding container e7a9092997d39da6304a7e4ffd3523d41311b9ad89df253cb01718cbd103812e: Status 404 returned error can't find the container with id e7a9092997d39da6304a7e4ffd3523d41311b9ad89df253cb01718cbd103812e Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.515122 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzvxj\" (UniqueName: \"kubernetes.io/projected/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-kube-api-access-nzvxj\") pod \"redhat-operators-5bzzr\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.515393 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-catalog-content\") pod \"redhat-operators-5bzzr\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.515539 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-utilities\") pod \"redhat-operators-5bzzr\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " 
pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.618263 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-catalog-content\") pod \"redhat-operators-5bzzr\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.618958 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-utilities\") pod \"redhat-operators-5bzzr\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.618992 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-catalog-content\") pod \"redhat-operators-5bzzr\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.619008 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzvxj\" (UniqueName: \"kubernetes.io/projected/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-kube-api-access-nzvxj\") pod \"redhat-operators-5bzzr\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.619641 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-utilities\") pod \"redhat-operators-5bzzr\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:31:03 crc 
kubenswrapper[4669]: I1001 11:31:03.644734 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzvxj\" (UniqueName: \"kubernetes.io/projected/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-kube-api-access-nzvxj\") pod \"redhat-operators-5bzzr\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.683254 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.720544 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s7dqt"] Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.721965 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.805357 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6pmkp"] Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.807025 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.807361 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 11:31:03 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 01 11:31:03 crc kubenswrapper[4669]: [+]process-running ok Oct 01 11:31:03 crc kubenswrapper[4669]: healthz check failed Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.807454 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.897480 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6pmkp"] Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.923346 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-utilities\") pod \"redhat-operators-6pmkp\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.923424 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm4rd\" (UniqueName: \"kubernetes.io/projected/bf3d0eba-0bac-4056-89ba-75708b18ab84-kube-api-access-xm4rd\") pod \"redhat-operators-6pmkp\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:31:03 crc kubenswrapper[4669]: I1001 11:31:03.923472 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-catalog-content\") pod \"redhat-operators-6pmkp\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.025085 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-catalog-content\") pod \"redhat-operators-6pmkp\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.025190 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-utilities\") pod \"redhat-operators-6pmkp\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.025219 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm4rd\" (UniqueName: \"kubernetes.io/projected/bf3d0eba-0bac-4056-89ba-75708b18ab84-kube-api-access-xm4rd\") pod \"redhat-operators-6pmkp\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.025781 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-catalog-content\") pod \"redhat-operators-6pmkp\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.025808 4669 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-utilities\") pod \"redhat-operators-6pmkp\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.045029 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm4rd\" (UniqueName: \"kubernetes.io/projected/bf3d0eba-0bac-4056-89ba-75708b18ab84-kube-api-access-xm4rd\") pod \"redhat-operators-6pmkp\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.079107 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5bzzr"] Oct 01 11:31:04 crc kubenswrapper[4669]: W1001 11:31:04.089749 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59f36b7c_ac0a_4ca4_90e3_2dfd686760fb.slice/crio-047ee96be580d7f80bd5cac485521134f067823abf7e9d0dabd5d78ab11035a7 WatchSource:0}: Error finding container 047ee96be580d7f80bd5cac485521134f067823abf7e9d0dabd5d78ab11035a7: Status 404 returned error can't find the container with id 047ee96be580d7f80bd5cac485521134f067823abf7e9d0dabd5d78ab11035a7 Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.126319 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.355213 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6pmkp"] Oct 01 11:31:04 crc kubenswrapper[4669]: W1001 11:31:04.363495 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf3d0eba_0bac_4056_89ba_75708b18ab84.slice/crio-25dd36e0a0867d6450bac1dd0224984c4002c80bd84e28786af46148fb86b307 WatchSource:0}: Error finding container 25dd36e0a0867d6450bac1dd0224984c4002c80bd84e28786af46148fb86b307: Status 404 returned error can't find the container with id 25dd36e0a0867d6450bac1dd0224984c4002c80bd84e28786af46148fb86b307 Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.415726 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzzr" event={"ID":"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb","Type":"ContainerStarted","Data":"047ee96be580d7f80bd5cac485521134f067823abf7e9d0dabd5d78ab11035a7"} Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.416791 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pmkp" event={"ID":"bf3d0eba-0bac-4056-89ba-75708b18ab84","Type":"ContainerStarted","Data":"25dd36e0a0867d6450bac1dd0224984c4002c80bd84e28786af46148fb86b307"} Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.418034 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5s7q" event={"ID":"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b","Type":"ContainerStarted","Data":"e7a9092997d39da6304a7e4ffd3523d41311b9ad89df253cb01718cbd103812e"} Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.419693 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf24n" 
event={"ID":"b90bf450-0186-475a-97dd-cb6ad25fb687","Type":"ContainerStarted","Data":"6fb7b1e21dcb53cab148677b9fe69372542a364c3ffdaa9844eb09258acfdfc4"} Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.420678 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" event={"ID":"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e","Type":"ContainerStarted","Data":"cfa5476fd0ffdf7abf4986018c10bb42df2803a4caca90cd9387f3dd7f170eb0"} Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.460165 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.460131251 podStartE2EDuration="3.460131251s" podCreationTimestamp="2025-10-01 11:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:31:04.4386284 +0000 UTC m=+155.538193387" watchObservedRunningTime="2025-10-01 11:31:04.460131251 +0000 UTC m=+155.559696268" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.593227 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.595023 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.597572 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.599860 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.615194 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.736111 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3114d12c-7a10-4421-97dd-027fb8137aef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3114d12c-7a10-4421-97dd-027fb8137aef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.736205 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3114d12c-7a10-4421-97dd-027fb8137aef-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3114d12c-7a10-4421-97dd-027fb8137aef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.759232 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.772811 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-flpxd" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.802128 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 11:31:04 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 01 11:31:04 crc kubenswrapper[4669]: [+]process-running ok Oct 01 11:31:04 crc kubenswrapper[4669]: healthz check failed Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.802212 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.850003 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3114d12c-7a10-4421-97dd-027fb8137aef-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3114d12c-7a10-4421-97dd-027fb8137aef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.850200 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3114d12c-7a10-4421-97dd-027fb8137aef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3114d12c-7a10-4421-97dd-027fb8137aef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.851426 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3114d12c-7a10-4421-97dd-027fb8137aef-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3114d12c-7a10-4421-97dd-027fb8137aef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 11:31:04 crc kubenswrapper[4669]: I1001 11:31:04.902367 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3114d12c-7a10-4421-97dd-027fb8137aef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3114d12c-7a10-4421-97dd-027fb8137aef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.168991 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.433588 4669 generic.go:334] "Generic (PLEG): container finished" podID="b90bf450-0186-475a-97dd-cb6ad25fb687" containerID="6fb7b1e21dcb53cab148677b9fe69372542a364c3ffdaa9844eb09258acfdfc4" exitCode=0 Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.434483 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf24n" event={"ID":"b90bf450-0186-475a-97dd-cb6ad25fb687","Type":"ContainerDied","Data":"6fb7b1e21dcb53cab148677b9fe69372542a364c3ffdaa9844eb09258acfdfc4"} Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.434718 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.443980 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" event={"ID":"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e","Type":"ContainerStarted","Data":"8e199821b5d5ad2d3a29328637d5e13bbdd1b40cfc62c6518fd8a9064895d91f"} Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.445470 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.450665 4669 generic.go:334] "Generic (PLEG): container finished" podID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerID="f1d13bb1336d7210e7a66c43fd39b9c3aefe50e3a4655d40f75c47741693c89d" exitCode=0 Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 
11:31:05.450834 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzzr" event={"ID":"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb","Type":"ContainerDied","Data":"f1d13bb1336d7210e7a66c43fd39b9c3aefe50e3a4655d40f75c47741693c89d"} Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.457678 4669 generic.go:334] "Generic (PLEG): container finished" podID="dd41b5b5-ff57-43a0-ba85-cf8b1428f88c" containerID="8fd6d85d2a977364c74425325f2258a2f31492484dc02e13258abde94245fd9e" exitCode=0 Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.457760 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c","Type":"ContainerDied","Data":"8fd6d85d2a977364c74425325f2258a2f31492484dc02e13258abde94245fd9e"} Oct 01 11:31:05 crc kubenswrapper[4669]: W1001 11:31:05.458763 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3114d12c_7a10_4421_97dd_027fb8137aef.slice/crio-3cbe40b0ea8f711f7b5fddf287f8fb68000b887038ad7afbb8903e951b67ed7e WatchSource:0}: Error finding container 3cbe40b0ea8f711f7b5fddf287f8fb68000b887038ad7afbb8903e951b67ed7e: Status 404 returned error can't find the container with id 3cbe40b0ea8f711f7b5fddf287f8fb68000b887038ad7afbb8903e951b67ed7e Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.478730 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" event={"ID":"1d0ee8e1-4e70-40fe-8780-567c7b49825b","Type":"ContainerStarted","Data":"f4a2c355e64c50833893e16c9289ebecc16772bc72f3df07b9fa2f7cd156580b"} Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.486860 4669 generic.go:334] "Generic (PLEG): container finished" podID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerID="aeb1eeda1cf7231e3442063f4ddc9cfac0a488d7cb844192999ed7abd5f69488" exitCode=0 Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 
11:31:05.486981 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pmkp" event={"ID":"bf3d0eba-0bac-4056-89ba-75708b18ab84","Type":"ContainerDied","Data":"aeb1eeda1cf7231e3442063f4ddc9cfac0a488d7cb844192999ed7abd5f69488"} Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.490223 4669 generic.go:334] "Generic (PLEG): container finished" podID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" containerID="defcc4511ac1de83acdaaddef2435041c281afff653ff805b27e076ca0d6680e" exitCode=0 Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.490243 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5s7q" event={"ID":"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b","Type":"ContainerDied","Data":"defcc4511ac1de83acdaaddef2435041c281afff653ff805b27e076ca0d6680e"} Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.497493 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" podStartSLOduration=135.497436523 podStartE2EDuration="2m15.497436523s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:31:05.496763277 +0000 UTC m=+156.596328274" watchObservedRunningTime="2025-10-01 11:31:05.497436523 +0000 UTC m=+156.597001500" Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.566565 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-q8qw4" podStartSLOduration=17.566537706 podStartE2EDuration="17.566537706s" podCreationTimestamp="2025-10-01 11:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:31:05.559863561 +0000 UTC m=+156.659428558" watchObservedRunningTime="2025-10-01 11:31:05.566537706 +0000 UTC 
m=+156.666102673" Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.803390 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 11:31:05 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 01 11:31:05 crc kubenswrapper[4669]: [+]process-running ok Oct 01 11:31:05 crc kubenswrapper[4669]: healthz check failed Oct 01 11:31:05 crc kubenswrapper[4669]: I1001 11:31:05.803476 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.505287 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3114d12c-7a10-4421-97dd-027fb8137aef","Type":"ContainerStarted","Data":"1c0eb9eb1a96875bd4c237c37eecde8a901e990f65cd12d671951a2f13692255"} Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.505345 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3114d12c-7a10-4421-97dd-027fb8137aef","Type":"ContainerStarted","Data":"3cbe40b0ea8f711f7b5fddf287f8fb68000b887038ad7afbb8903e951b67ed7e"} Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.716969 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5bjch" Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.734638 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.73461857 podStartE2EDuration="2.73461857s" podCreationTimestamp="2025-10-01 11:31:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:31:06.527449127 +0000 UTC m=+157.627014104" watchObservedRunningTime="2025-10-01 11:31:06.73461857 +0000 UTC m=+157.834183547" Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.798461 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 11:31:06 crc kubenswrapper[4669]: [-]has-synced failed: reason withheld Oct 01 11:31:06 crc kubenswrapper[4669]: [+]process-running ok Oct 01 11:31:06 crc kubenswrapper[4669]: healthz check failed Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.798533 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.827190 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.906148 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kube-api-access\") pod \"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c\" (UID: \"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c\") " Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.906244 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kubelet-dir\") pod \"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c\" (UID: \"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c\") " Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.906776 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dd41b5b5-ff57-43a0-ba85-cf8b1428f88c" (UID: "dd41b5b5-ff57-43a0-ba85-cf8b1428f88c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.908586 4669 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 11:31:06 crc kubenswrapper[4669]: I1001 11:31:06.918906 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dd41b5b5-ff57-43a0-ba85-cf8b1428f88c" (UID: "dd41b5b5-ff57-43a0-ba85-cf8b1428f88c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:31:07 crc kubenswrapper[4669]: I1001 11:31:07.009718 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd41b5b5-ff57-43a0-ba85-cf8b1428f88c-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 11:31:07 crc kubenswrapper[4669]: I1001 11:31:07.516783 4669 generic.go:334] "Generic (PLEG): container finished" podID="3114d12c-7a10-4421-97dd-027fb8137aef" containerID="1c0eb9eb1a96875bd4c237c37eecde8a901e990f65cd12d671951a2f13692255" exitCode=0 Oct 01 11:31:07 crc kubenswrapper[4669]: I1001 11:31:07.516866 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3114d12c-7a10-4421-97dd-027fb8137aef","Type":"ContainerDied","Data":"1c0eb9eb1a96875bd4c237c37eecde8a901e990f65cd12d671951a2f13692255"} Oct 01 11:31:07 crc kubenswrapper[4669]: I1001 11:31:07.520876 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd41b5b5-ff57-43a0-ba85-cf8b1428f88c","Type":"ContainerDied","Data":"37edec748fd67d268708a0d936e8f30bbcc2445095c41249646d7f31bf2b9314"} Oct 01 11:31:07 crc kubenswrapper[4669]: I1001 11:31:07.520960 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37edec748fd67d268708a0d936e8f30bbcc2445095c41249646d7f31bf2b9314" Oct 01 11:31:07 crc kubenswrapper[4669]: I1001 11:31:07.521109 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 11:31:07 crc kubenswrapper[4669]: I1001 11:31:07.797852 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 11:31:07 crc kubenswrapper[4669]: [+]has-synced ok Oct 01 11:31:07 crc kubenswrapper[4669]: [+]process-running ok Oct 01 11:31:07 crc kubenswrapper[4669]: healthz check failed Oct 01 11:31:07 crc kubenswrapper[4669]: I1001 11:31:07.797949 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 11:31:08 crc kubenswrapper[4669]: I1001 11:31:08.777779 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 11:31:08 crc kubenswrapper[4669]: I1001 11:31:08.801863 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:31:08 crc kubenswrapper[4669]: I1001 11:31:08.806344 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gmqg9" Oct 01 11:31:08 crc kubenswrapper[4669]: I1001 11:31:08.844722 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3114d12c-7a10-4421-97dd-027fb8137aef-kubelet-dir\") pod \"3114d12c-7a10-4421-97dd-027fb8137aef\" (UID: \"3114d12c-7a10-4421-97dd-027fb8137aef\") " Oct 01 11:31:08 crc kubenswrapper[4669]: I1001 11:31:08.844852 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3114d12c-7a10-4421-97dd-027fb8137aef-kube-api-access\") pod \"3114d12c-7a10-4421-97dd-027fb8137aef\" (UID: \"3114d12c-7a10-4421-97dd-027fb8137aef\") " Oct 01 11:31:08 crc kubenswrapper[4669]: I1001 11:31:08.847295 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3114d12c-7a10-4421-97dd-027fb8137aef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3114d12c-7a10-4421-97dd-027fb8137aef" (UID: "3114d12c-7a10-4421-97dd-027fb8137aef"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:31:08 crc kubenswrapper[4669]: I1001 11:31:08.855505 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3114d12c-7a10-4421-97dd-027fb8137aef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3114d12c-7a10-4421-97dd-027fb8137aef" (UID: "3114d12c-7a10-4421-97dd-027fb8137aef"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:31:08 crc kubenswrapper[4669]: I1001 11:31:08.946647 4669 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3114d12c-7a10-4421-97dd-027fb8137aef-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 11:31:08 crc kubenswrapper[4669]: I1001 11:31:08.946680 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3114d12c-7a10-4421-97dd-027fb8137aef-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 11:31:09 crc kubenswrapper[4669]: I1001 11:31:09.547041 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3114d12c-7a10-4421-97dd-027fb8137aef","Type":"ContainerDied","Data":"3cbe40b0ea8f711f7b5fddf287f8fb68000b887038ad7afbb8903e951b67ed7e"} Oct 01 11:31:09 crc kubenswrapper[4669]: I1001 11:31:09.547114 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cbe40b0ea8f711f7b5fddf287f8fb68000b887038ad7afbb8903e951b67ed7e" Oct 01 11:31:09 crc kubenswrapper[4669]: I1001 11:31:09.547291 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 11:31:11 crc kubenswrapper[4669]: I1001 11:31:11.295055 4669 patch_prober.go:28] interesting pod/console-f9d7485db-cclkd container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 01 11:31:11 crc kubenswrapper[4669]: I1001 11:31:11.295747 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-cclkd" podUID="1467a745-44bf-40c6-a065-5008543d1363" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 01 11:31:11 crc kubenswrapper[4669]: I1001 11:31:11.409190 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:31:11 crc kubenswrapper[4669]: I1001 11:31:11.409248 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:31:11 crc kubenswrapper[4669]: I1001 11:31:11.409826 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:31:11 crc kubenswrapper[4669]: I1001 11:31:11.409854 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" 
podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:31:12 crc kubenswrapper[4669]: I1001 11:31:12.722046 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:31:12 crc kubenswrapper[4669]: I1001 11:31:12.732274 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30ba513f-67c5-4e4f-b8a7-be9c67660bec-metrics-certs\") pod \"network-metrics-daemon-wvnw6\" (UID: \"30ba513f-67c5-4e4f-b8a7-be9c67660bec\") " pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:31:12 crc kubenswrapper[4669]: I1001 11:31:12.973559 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvnw6" Oct 01 11:31:13 crc kubenswrapper[4669]: I1001 11:31:13.510421 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wvnw6"] Oct 01 11:31:21 crc kubenswrapper[4669]: I1001 11:31:21.304938 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:31:21 crc kubenswrapper[4669]: I1001 11:31:21.311359 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-cclkd" Oct 01 11:31:21 crc kubenswrapper[4669]: I1001 11:31:21.409926 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:31:21 crc kubenswrapper[4669]: I1001 11:31:21.410015 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:31:21 crc kubenswrapper[4669]: I1001 11:31:21.409944 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:31:21 crc kubenswrapper[4669]: I1001 11:31:21.410174 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: 
connection refused" Oct 01 11:31:21 crc kubenswrapper[4669]: I1001 11:31:21.410209 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-hdrzq" Oct 01 11:31:21 crc kubenswrapper[4669]: I1001 11:31:21.411125 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"51a63a4d29c240869363fa77a1163ef0aab8a703d656590d7173d36995488338"} pod="openshift-console/downloads-7954f5f757-hdrzq" containerMessage="Container download-server failed liveness probe, will be restarted" Oct 01 11:31:21 crc kubenswrapper[4669]: I1001 11:31:21.411289 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" containerID="cri-o://51a63a4d29c240869363fa77a1163ef0aab8a703d656590d7173d36995488338" gracePeriod=2 Oct 01 11:31:21 crc kubenswrapper[4669]: I1001 11:31:21.411612 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:31:21 crc kubenswrapper[4669]: I1001 11:31:21.411712 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:31:22 crc kubenswrapper[4669]: I1001 11:31:22.836516 4669 patch_prober.go:28] interesting pod/router-default-5444994796-gmqg9 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" start-of-body= Oct 01 11:31:22 crc kubenswrapper[4669]: I1001 11:31:22.837151 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-gmqg9" podUID="cfd95c71-623e-4ee4-aadf-752a8e07d362" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 11:31:23 crc kubenswrapper[4669]: I1001 11:31:23.456744 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:31:26 crc kubenswrapper[4669]: I1001 11:31:26.682231 4669 generic.go:334] "Generic (PLEG): container finished" podID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerID="51a63a4d29c240869363fa77a1163ef0aab8a703d656590d7173d36995488338" exitCode=0 Oct 01 11:31:26 crc kubenswrapper[4669]: I1001 11:31:26.682309 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hdrzq" event={"ID":"54979db4-1c85-4bfd-aec1-c154590ec33b","Type":"ContainerDied","Data":"51a63a4d29c240869363fa77a1163ef0aab8a703d656590d7173d36995488338"} Oct 01 11:31:31 crc kubenswrapper[4669]: I1001 11:31:31.410841 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:31:31 crc kubenswrapper[4669]: I1001 11:31:31.412002 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:31:31 crc kubenswrapper[4669]: I1001 11:31:31.864299 4669 patch_prober.go:28] 
interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:31:31 crc kubenswrapper[4669]: I1001 11:31:31.864379 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:31:32 crc kubenswrapper[4669]: I1001 11:31:32.163420 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h4bpt" Oct 01 11:31:37 crc kubenswrapper[4669]: I1001 11:31:37.786369 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 11:31:41 crc kubenswrapper[4669]: I1001 11:31:41.410170 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:31:41 crc kubenswrapper[4669]: I1001 11:31:41.410633 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:31:41 crc kubenswrapper[4669]: I1001 11:31:41.783634 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" 
event={"ID":"30ba513f-67c5-4e4f-b8a7-be9c67660bec","Type":"ContainerStarted","Data":"15c0f21b3a6b722a652045932caae3fa76af92e02b448065d704143e21f75c1b"} Oct 01 11:31:51 crc kubenswrapper[4669]: I1001 11:31:51.410847 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:31:51 crc kubenswrapper[4669]: I1001 11:31:51.411580 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:31:56 crc kubenswrapper[4669]: E1001 11:31:56.665007 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 11:31:56 crc kubenswrapper[4669]: E1001 11:31:56.665602 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4xktr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hwdf6_openshift-marketplace(ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 11:31:56 crc kubenswrapper[4669]: E1001 11:31:56.666895 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hwdf6" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" Oct 01 11:31:59 crc 
kubenswrapper[4669]: E1001 11:31:59.120732 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hwdf6" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" Oct 01 11:31:59 crc kubenswrapper[4669]: E1001 11:31:59.126774 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:7fd0dc32cc360fd8eccbeef60647eb669da91b47f9b9e7a82238ffe30f860285: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:7fd0dc32cc360fd8eccbeef60647eb669da91b47f9b9e7a82238ffe30f860285\": context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 11:31:59 crc kubenswrapper[4669]: E1001 11:31:59.127143 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xm4rd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-6pmkp_openshift-marketplace(bf3d0eba-0bac-4056-89ba-75708b18ab84): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:7fd0dc32cc360fd8eccbeef60647eb669da91b47f9b9e7a82238ffe30f860285: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:7fd0dc32cc360fd8eccbeef60647eb669da91b47f9b9e7a82238ffe30f860285\": context canceled" logger="UnhandledError" Oct 01 11:31:59 crc kubenswrapper[4669]: E1001 11:31:59.128349 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: reading blob sha256:7fd0dc32cc360fd8eccbeef60647eb669da91b47f9b9e7a82238ffe30f860285: Get \\\"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:7fd0dc32cc360fd8eccbeef60647eb669da91b47f9b9e7a82238ffe30f860285\\\": context canceled\"" pod="openshift-marketplace/redhat-operators-6pmkp" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" Oct 01 11:31:59 crc kubenswrapper[4669]: E1001 11:31:59.390209 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 11:31:59 crc kubenswrapper[4669]: E1001 11:31:59.390806 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qwf2w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p69c8_openshift-marketplace(7b67d961-10c4-45b0-84e0-99ff6afc366a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 11:31:59 crc kubenswrapper[4669]: E1001 11:31:59.392443 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p69c8" podUID="7b67d961-10c4-45b0-84e0-99ff6afc366a" Oct 01 11:31:59 crc 
kubenswrapper[4669]: E1001 11:31:59.695305 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 11:31:59 crc kubenswrapper[4669]: E1001 11:31:59.695495 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sl82v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-kk6f2_openshift-marketplace(356dfcd7-c70a-4494-aeed-89aa3393ecd9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 11:31:59 crc kubenswrapper[4669]: E1001 11:31:59.696748 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kk6f2" podUID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" Oct 01 11:31:59 crc kubenswrapper[4669]: E1001 11:31:59.730046 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 11:31:59 crc kubenswrapper[4669]: E1001 11:31:59.730265 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tccjw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-n2l4s_openshift-marketplace(0b8d4849-98dd-4b1b-90dc-9151e8b17224): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 11:31:59 crc kubenswrapper[4669]: E1001 11:31:59.731691 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-n2l4s" podUID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" Oct 01 11:32:00 crc 
kubenswrapper[4669]: E1001 11:32:00.061313 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-6pmkp" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" Oct 01 11:32:00 crc kubenswrapper[4669]: E1001 11:32:00.101279 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 11:32:00 crc kubenswrapper[4669]: E1001 11:32:00.101864 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6rhrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-d5s7q_openshift-marketplace(b4423fe7-fc2d-4e49-b39f-a9641ce1c28b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 11:32:00 crc kubenswrapper[4669]: E1001 11:32:00.103158 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-d5s7q" podUID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" Oct 01 11:32:00 crc 
kubenswrapper[4669]: E1001 11:32:00.114124 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 11:32:00 crc kubenswrapper[4669]: E1001 11:32:00.114395 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vjk78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-sf24n_openshift-marketplace(b90bf450-0186-475a-97dd-cb6ad25fb687): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 11:32:00 crc kubenswrapper[4669]: E1001 11:32:00.115666 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sf24n" podUID="b90bf450-0186-475a-97dd-cb6ad25fb687" Oct 01 11:32:01 crc kubenswrapper[4669]: I1001 11:32:01.409626 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:32:01 crc kubenswrapper[4669]: I1001 11:32:01.409692 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:32:01 crc kubenswrapper[4669]: I1001 11:32:01.863395 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:32:01 crc kubenswrapper[4669]: I1001 11:32:01.863747 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:32:01 crc kubenswrapper[4669]: I1001 11:32:01.863796 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:32:01 crc kubenswrapper[4669]: I1001 11:32:01.864409 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 11:32:01 crc kubenswrapper[4669]: I1001 11:32:01.864457 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054" gracePeriod=600 Oct 01 11:32:02 crc kubenswrapper[4669]: I1001 11:32:02.931230 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054" exitCode=0 Oct 01 11:32:02 crc kubenswrapper[4669]: I1001 11:32:02.931280 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054"} Oct 01 11:32:03 crc kubenswrapper[4669]: E1001 11:32:03.474498 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-d5s7q" podUID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" Oct 01 11:32:03 crc kubenswrapper[4669]: E1001 11:32:03.474887 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sf24n" podUID="b90bf450-0186-475a-97dd-cb6ad25fb687" Oct 01 11:32:03 crc kubenswrapper[4669]: E1001 11:32:03.522252 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 11:32:03 crc kubenswrapper[4669]: E1001 11:32:03.522526 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nzvxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5bzzr_openshift-marketplace(59f36b7c-ac0a-4ca4-90e3-2dfd686760fb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 11:32:03 crc kubenswrapper[4669]: E1001 11:32:03.523820 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5bzzr" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" Oct 01 11:32:03 crc 
kubenswrapper[4669]: I1001 11:32:03.942532 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" event={"ID":"30ba513f-67c5-4e4f-b8a7-be9c67660bec","Type":"ContainerStarted","Data":"48ee50a8b4bec5f40c789de81a7672903482f2a5f37c6a017e6fcfa107e16102"} Oct 01 11:32:03 crc kubenswrapper[4669]: I1001 11:32:03.948875 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"8027885e355d02196c881ebc15cce3dfddbca8c6fa333e055455ca80503be475"} Oct 01 11:32:03 crc kubenswrapper[4669]: I1001 11:32:03.956609 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hdrzq" event={"ID":"54979db4-1c85-4bfd-aec1-c154590ec33b","Type":"ContainerStarted","Data":"960bee0f0f44655a1af3c741d8cd07945c4bbf878e204e90a487174314a85178"} Oct 01 11:32:03 crc kubenswrapper[4669]: I1001 11:32:03.956674 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hdrzq" Oct 01 11:32:03 crc kubenswrapper[4669]: I1001 11:32:03.956739 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:32:03 crc kubenswrapper[4669]: I1001 11:32:03.956765 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:32:03 crc kubenswrapper[4669]: E1001 11:32:03.958069 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5bzzr" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" Oct 01 11:32:04 crc kubenswrapper[4669]: I1001 11:32:04.978376 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wvnw6" event={"ID":"30ba513f-67c5-4e4f-b8a7-be9c67660bec","Type":"ContainerStarted","Data":"4c615b0ad9443a750420b1cc339b79aec198506538bbcb64b1bc30697d2ef2f9"} Oct 01 11:32:04 crc kubenswrapper[4669]: I1001 11:32:04.979258 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:32:04 crc kubenswrapper[4669]: I1001 11:32:04.979360 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:32:05 crc kubenswrapper[4669]: I1001 11:32:05.020623 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wvnw6" podStartSLOduration=195.020580497 podStartE2EDuration="3m15.020580497s" podCreationTimestamp="2025-10-01 11:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:32:05.013897402 +0000 UTC m=+216.113462379" watchObservedRunningTime="2025-10-01 11:32:05.020580497 +0000 UTC m=+216.120145524" Oct 01 11:32:11 crc kubenswrapper[4669]: I1001 11:32:11.410033 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server 
namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:32:11 crc kubenswrapper[4669]: I1001 11:32:11.413025 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:32:11 crc kubenswrapper[4669]: I1001 11:32:11.410128 4669 patch_prober.go:28] interesting pod/downloads-7954f5f757-hdrzq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 01 11:32:11 crc kubenswrapper[4669]: I1001 11:32:11.413448 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hdrzq" podUID="54979db4-1c85-4bfd-aec1-c154590ec33b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 01 11:32:21 crc kubenswrapper[4669]: I1001 11:32:21.424935 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-hdrzq" Oct 01 11:32:25 crc kubenswrapper[4669]: I1001 11:32:25.135340 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2l4s" event={"ID":"0b8d4849-98dd-4b1b-90dc-9151e8b17224","Type":"ContainerStarted","Data":"427d0110cf2f090a8104120d99740985c7bb63326d980c7e8a52cb28860fdf9e"} Oct 01 11:32:25 crc kubenswrapper[4669]: I1001 11:32:25.138750 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p69c8" 
event={"ID":"7b67d961-10c4-45b0-84e0-99ff6afc366a","Type":"ContainerStarted","Data":"7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370"} Oct 01 11:32:25 crc kubenswrapper[4669]: I1001 11:32:25.141488 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk6f2" event={"ID":"356dfcd7-c70a-4494-aeed-89aa3393ecd9","Type":"ContainerStarted","Data":"8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896"} Oct 01 11:32:25 crc kubenswrapper[4669]: I1001 11:32:25.143695 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pmkp" event={"ID":"bf3d0eba-0bac-4056-89ba-75708b18ab84","Type":"ContainerStarted","Data":"41ee1a4fc56f4fc6e64f73d43b070b319a9d2258aa1477fcacaa7250b8dc23a2"} Oct 01 11:32:25 crc kubenswrapper[4669]: I1001 11:32:25.145861 4669 generic.go:334] "Generic (PLEG): container finished" podID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" containerID="93554a740f592e4abc7455f208beca4d5e35136b57234c0fb329c41ae1b5b6d3" exitCode=0 Oct 01 11:32:25 crc kubenswrapper[4669]: I1001 11:32:25.145931 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5s7q" event={"ID":"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b","Type":"ContainerDied","Data":"93554a740f592e4abc7455f208beca4d5e35136b57234c0fb329c41ae1b5b6d3"} Oct 01 11:32:25 crc kubenswrapper[4669]: I1001 11:32:25.148784 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdf6" event={"ID":"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0","Type":"ContainerStarted","Data":"792934d93e8496f47e67defb5cd77b2defaf703d608aadd33fedafdb983c8eb2"} Oct 01 11:32:25 crc kubenswrapper[4669]: I1001 11:32:25.150897 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzzr" 
event={"ID":"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb","Type":"ContainerStarted","Data":"39272a89e432cd886731b92af37685403485cc483a13b91852dcc61c32bdc07f"} Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.161587 4669 generic.go:334] "Generic (PLEG): container finished" podID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerID="7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370" exitCode=0 Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.161691 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p69c8" event={"ID":"7b67d961-10c4-45b0-84e0-99ff6afc366a","Type":"ContainerDied","Data":"7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370"} Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.167576 4669 generic.go:334] "Generic (PLEG): container finished" podID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerID="8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896" exitCode=0 Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.167648 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk6f2" event={"ID":"356dfcd7-c70a-4494-aeed-89aa3393ecd9","Type":"ContainerDied","Data":"8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896"} Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.173429 4669 generic.go:334] "Generic (PLEG): container finished" podID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerID="41ee1a4fc56f4fc6e64f73d43b070b319a9d2258aa1477fcacaa7250b8dc23a2" exitCode=0 Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.173522 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pmkp" event={"ID":"bf3d0eba-0bac-4056-89ba-75708b18ab84","Type":"ContainerDied","Data":"41ee1a4fc56f4fc6e64f73d43b070b319a9d2258aa1477fcacaa7250b8dc23a2"} Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.177204 4669 generic.go:334] "Generic (PLEG): container 
finished" podID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerID="427d0110cf2f090a8104120d99740985c7bb63326d980c7e8a52cb28860fdf9e" exitCode=0 Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.177298 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2l4s" event={"ID":"0b8d4849-98dd-4b1b-90dc-9151e8b17224","Type":"ContainerDied","Data":"427d0110cf2f090a8104120d99740985c7bb63326d980c7e8a52cb28860fdf9e"} Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.180275 4669 generic.go:334] "Generic (PLEG): container finished" podID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerID="792934d93e8496f47e67defb5cd77b2defaf703d608aadd33fedafdb983c8eb2" exitCode=0 Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.180338 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdf6" event={"ID":"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0","Type":"ContainerDied","Data":"792934d93e8496f47e67defb5cd77b2defaf703d608aadd33fedafdb983c8eb2"} Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.183221 4669 generic.go:334] "Generic (PLEG): container finished" podID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerID="39272a89e432cd886731b92af37685403485cc483a13b91852dcc61c32bdc07f" exitCode=0 Oct 01 11:32:26 crc kubenswrapper[4669]: I1001 11:32:26.183246 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzzr" event={"ID":"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb","Type":"ContainerDied","Data":"39272a89e432cd886731b92af37685403485cc483a13b91852dcc61c32bdc07f"} Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.429133 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p69c8" event={"ID":"7b67d961-10c4-45b0-84e0-99ff6afc366a","Type":"ContainerStarted","Data":"085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51"} Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 
11:33:00.432055 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk6f2" event={"ID":"356dfcd7-c70a-4494-aeed-89aa3393ecd9","Type":"ContainerStarted","Data":"7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63"} Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.435335 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pmkp" event={"ID":"bf3d0eba-0bac-4056-89ba-75708b18ab84","Type":"ContainerStarted","Data":"fd2323c1496ccf1c805c621a5398b97b9f41f3f3f9aa50fb6ee9681c7b87ee43"} Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.439575 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5s7q" event={"ID":"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b","Type":"ContainerStarted","Data":"8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070"} Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.444368 4669 generic.go:334] "Generic (PLEG): container finished" podID="b90bf450-0186-475a-97dd-cb6ad25fb687" containerID="c7c24d89b46ecd7f19910f77216500c0fae08582fb17ff94f8b755fbcb644374" exitCode=0 Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.444457 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf24n" event={"ID":"b90bf450-0186-475a-97dd-cb6ad25fb687","Type":"ContainerDied","Data":"c7c24d89b46ecd7f19910f77216500c0fae08582fb17ff94f8b755fbcb644374"} Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.449000 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdf6" event={"ID":"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0","Type":"ContainerStarted","Data":"e7adea424c81018c5f2fa03858830fa3d08c47d007ac1f19d5b52a1ed6c6ea11"} Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.453551 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzzr" 
event={"ID":"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb","Type":"ContainerStarted","Data":"4f90ce6d0e16868414807f984f91f54c4b61b974f8210e4f85f7bd79109aaf37"} Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.462666 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2l4s" event={"ID":"0b8d4849-98dd-4b1b-90dc-9151e8b17224","Type":"ContainerStarted","Data":"08474a5302b079f5e0568daadc3759cb4c0480594fa7b7ddbddbf0e93732786c"} Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.464405 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p69c8" podStartSLOduration=5.789768496 podStartE2EDuration="2m0.464062147s" podCreationTimestamp="2025-10-01 11:31:00 +0000 UTC" firstStartedPulling="2025-10-01 11:31:04.423591281 +0000 UTC m=+155.523156258" lastFinishedPulling="2025-10-01 11:32:59.097884902 +0000 UTC m=+270.197449909" observedRunningTime="2025-10-01 11:33:00.455368703 +0000 UTC m=+271.554933690" watchObservedRunningTime="2025-10-01 11:33:00.464062147 +0000 UTC m=+271.563627124" Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.480179 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6pmkp" podStartSLOduration=7.038408316 podStartE2EDuration="1m57.480155453s" podCreationTimestamp="2025-10-01 11:31:03 +0000 UTC" firstStartedPulling="2025-10-01 11:31:05.513178452 +0000 UTC m=+156.612743429" lastFinishedPulling="2025-10-01 11:32:55.954925549 +0000 UTC m=+267.054490566" observedRunningTime="2025-10-01 11:33:00.477644801 +0000 UTC m=+271.577209788" watchObservedRunningTime="2025-10-01 11:33:00.480155453 +0000 UTC m=+271.579720450" Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.499903 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5bzzr" podStartSLOduration=15.226718098 podStartE2EDuration="1m57.499877899s" 
podCreationTimestamp="2025-10-01 11:31:03 +0000 UTC" firstStartedPulling="2025-10-01 11:31:05.456363791 +0000 UTC m=+156.555928768" lastFinishedPulling="2025-10-01 11:32:47.729523592 +0000 UTC m=+258.829088569" observedRunningTime="2025-10-01 11:33:00.49869999 +0000 UTC m=+271.598264987" watchObservedRunningTime="2025-10-01 11:33:00.499877899 +0000 UTC m=+271.599442876"
Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.525994 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hwdf6" podStartSLOduration=4.883639122 podStartE2EDuration="2m0.525977442s" podCreationTimestamp="2025-10-01 11:31:00 +0000 UTC" firstStartedPulling="2025-10-01 11:31:03.457193373 +0000 UTC m=+154.556758350" lastFinishedPulling="2025-10-01 11:32:59.099531683 +0000 UTC m=+270.199096670" observedRunningTime="2025-10-01 11:33:00.523579423 +0000 UTC m=+271.623144400" watchObservedRunningTime="2025-10-01 11:33:00.525977442 +0000 UTC m=+271.625542419"
Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.530620 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n2l4s"
Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.530699 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n2l4s"
Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.553874 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d5s7q" podStartSLOduration=6.441285982 podStartE2EDuration="1m58.553855848s" podCreationTimestamp="2025-10-01 11:31:02 +0000 UTC" firstStartedPulling="2025-10-01 11:31:05.513203162 +0000 UTC m=+156.612768139" lastFinishedPulling="2025-10-01 11:32:57.625772998 +0000 UTC m=+268.725338005" observedRunningTime="2025-10-01 11:33:00.550120457 +0000 UTC m=+271.649685444" watchObservedRunningTime="2025-10-01 11:33:00.553855848 +0000 UTC m=+271.653420825"
Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.620412 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kk6f2" podStartSLOduration=5.945900801 podStartE2EDuration="2m0.620385938s" podCreationTimestamp="2025-10-01 11:31:00 +0000 UTC" firstStartedPulling="2025-10-01 11:31:04.42317739 +0000 UTC m=+155.522742367" lastFinishedPulling="2025-10-01 11:32:59.097662527 +0000 UTC m=+270.197227504" observedRunningTime="2025-10-01 11:33:00.615571739 +0000 UTC m=+271.715136726" watchObservedRunningTime="2025-10-01 11:33:00.620385938 +0000 UTC m=+271.719950915"
Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.638088 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n2l4s" podStartSLOduration=4.994702569 podStartE2EDuration="2m0.638052943s" podCreationTimestamp="2025-10-01 11:31:00 +0000 UTC" firstStartedPulling="2025-10-01 11:31:03.456440125 +0000 UTC m=+154.556005102" lastFinishedPulling="2025-10-01 11:32:59.099790469 +0000 UTC m=+270.199355476" observedRunningTime="2025-10-01 11:33:00.635645674 +0000 UTC m=+271.735210661" watchObservedRunningTime="2025-10-01 11:33:00.638052943 +0000 UTC m=+271.737617920"
Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.922574 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p69c8"
Oct 01 11:33:00 crc kubenswrapper[4669]: I1001 11:33:00.922635 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p69c8"
Oct 01 11:33:01 crc kubenswrapper[4669]: I1001 11:33:01.419859 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kk6f2"
Oct 01 11:33:01 crc kubenswrapper[4669]: I1001 11:33:01.420346 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kk6f2"
Oct 01 11:33:01 crc kubenswrapper[4669]: I1001 11:33:01.420362 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hwdf6"
Oct 01 11:33:01 crc kubenswrapper[4669]: I1001 11:33:01.420376 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hwdf6"
Oct 01 11:33:01 crc kubenswrapper[4669]: I1001 11:33:01.481056 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf24n" event={"ID":"b90bf450-0186-475a-97dd-cb6ad25fb687","Type":"ContainerStarted","Data":"4a652f988e42c4d1a7d69825106cc701a1ffd87c0df6c8d6820858a0990fd8ad"}
Oct 01 11:33:01 crc kubenswrapper[4669]: I1001 11:33:01.501680 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sf24n" podStartSLOduration=3.909824183 podStartE2EDuration="1m59.501655397s" podCreationTimestamp="2025-10-01 11:31:02 +0000 UTC" firstStartedPulling="2025-10-01 11:31:05.436769159 +0000 UTC m=+156.536334136" lastFinishedPulling="2025-10-01 11:33:01.028600373 +0000 UTC m=+272.128165350" observedRunningTime="2025-10-01 11:33:01.500733365 +0000 UTC m=+272.600298342" watchObservedRunningTime="2025-10-01 11:33:01.501655397 +0000 UTC m=+272.601220374"
Oct 01 11:33:01 crc kubenswrapper[4669]: I1001 11:33:01.785755 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-n2l4s" podUID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerName="registry-server" probeResult="failure" output=<
Oct 01 11:33:01 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s
Oct 01 11:33:01 crc kubenswrapper[4669]: >
Oct 01 11:33:01 crc kubenswrapper[4669]: I1001 11:33:01.966223 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-p69c8" podUID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerName="registry-server" probeResult="failure" output=<
Oct 01 11:33:01 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s
Oct 01 11:33:01 crc kubenswrapper[4669]: >
Oct 01 11:33:02 crc kubenswrapper[4669]: I1001 11:33:02.461916 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hwdf6" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerName="registry-server" probeResult="failure" output=<
Oct 01 11:33:02 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s
Oct 01 11:33:02 crc kubenswrapper[4669]: >
Oct 01 11:33:02 crc kubenswrapper[4669]: I1001 11:33:02.484542 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-kk6f2" podUID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerName="registry-server" probeResult="failure" output=<
Oct 01 11:33:02 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s
Oct 01 11:33:02 crc kubenswrapper[4669]: >
Oct 01 11:33:02 crc kubenswrapper[4669]: I1001 11:33:02.730539 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sf24n"
Oct 01 11:33:02 crc kubenswrapper[4669]: I1001 11:33:02.731062 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sf24n"
Oct 01 11:33:02 crc kubenswrapper[4669]: I1001 11:33:02.802973 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sf24n"
Oct 01 11:33:03 crc kubenswrapper[4669]: I1001 11:33:03.115172 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d5s7q"
Oct 01 11:33:03 crc kubenswrapper[4669]: I1001 11:33:03.115223 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d5s7q"
Oct 01 11:33:03 crc kubenswrapper[4669]: I1001 11:33:03.166660 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d5s7q"
Oct 01 11:33:03 crc kubenswrapper[4669]: I1001 11:33:03.722944 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5bzzr"
Oct 01 11:33:03 crc kubenswrapper[4669]: I1001 11:33:03.723006 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5bzzr"
Oct 01 11:33:04 crc kubenswrapper[4669]: I1001 11:33:04.126492 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6pmkp"
Oct 01 11:33:04 crc kubenswrapper[4669]: I1001 11:33:04.126589 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6pmkp"
Oct 01 11:33:04 crc kubenswrapper[4669]: I1001 11:33:04.763034 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5bzzr" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerName="registry-server" probeResult="failure" output=<
Oct 01 11:33:04 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s
Oct 01 11:33:04 crc kubenswrapper[4669]: >
Oct 01 11:33:05 crc kubenswrapper[4669]: I1001 11:33:05.185193 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6pmkp" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerName="registry-server" probeResult="failure" output=<
Oct 01 11:33:05 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s
Oct 01 11:33:05 crc kubenswrapper[4669]: >
Oct 01 11:33:10 crc kubenswrapper[4669]: I1001 11:33:10.578202 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n2l4s"
Oct 01 11:33:10 crc kubenswrapper[4669]: I1001 11:33:10.642985 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n2l4s"
Oct 01 11:33:10 crc kubenswrapper[4669]: I1001 11:33:10.963155 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p69c8"
Oct 01 11:33:11 crc kubenswrapper[4669]: I1001 11:33:11.009994 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p69c8"
Oct 01 11:33:11 crc kubenswrapper[4669]: I1001 11:33:11.466561 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hwdf6"
Oct 01 11:33:11 crc kubenswrapper[4669]: I1001 11:33:11.471036 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kk6f2"
Oct 01 11:33:11 crc kubenswrapper[4669]: I1001 11:33:11.508519 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hwdf6"
Oct 01 11:33:11 crc kubenswrapper[4669]: I1001 11:33:11.521276 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kk6f2"
Oct 01 11:33:12 crc kubenswrapper[4669]: I1001 11:33:12.013786 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p69c8"]
Oct 01 11:33:12 crc kubenswrapper[4669]: I1001 11:33:12.550460 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p69c8" podUID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerName="registry-server" containerID="cri-o://085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51" gracePeriod=2
Oct 01 11:33:12 crc kubenswrapper[4669]: E1001 11:33:12.681004 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b67d961_10c4_45b0_84e0_99ff6afc366a.slice/crio-conmon-085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51.scope\": RecentStats: unable to find data in memory cache]"
Oct 01 11:33:12 crc kubenswrapper[4669]: I1001 11:33:12.784674 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sf24n"
Oct 01 11:33:12 crc kubenswrapper[4669]: I1001 11:33:12.950952 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p69c8"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.042838 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-utilities\") pod \"7b67d961-10c4-45b0-84e0-99ff6afc366a\" (UID: \"7b67d961-10c4-45b0-84e0-99ff6afc366a\") "
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.042945 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-catalog-content\") pod \"7b67d961-10c4-45b0-84e0-99ff6afc366a\" (UID: \"7b67d961-10c4-45b0-84e0-99ff6afc366a\") "
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.043058 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwf2w\" (UniqueName: \"kubernetes.io/projected/7b67d961-10c4-45b0-84e0-99ff6afc366a-kube-api-access-qwf2w\") pod \"7b67d961-10c4-45b0-84e0-99ff6afc366a\" (UID: \"7b67d961-10c4-45b0-84e0-99ff6afc366a\") "
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.043649 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-utilities" (OuterVolumeSpecName: "utilities") pod "7b67d961-10c4-45b0-84e0-99ff6afc366a" (UID: "7b67d961-10c4-45b0-84e0-99ff6afc366a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.049762 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b67d961-10c4-45b0-84e0-99ff6afc366a-kube-api-access-qwf2w" (OuterVolumeSpecName: "kube-api-access-qwf2w") pod "7b67d961-10c4-45b0-84e0-99ff6afc366a" (UID: "7b67d961-10c4-45b0-84e0-99ff6afc366a"). InnerVolumeSpecName "kube-api-access-qwf2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.085278 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b67d961-10c4-45b0-84e0-99ff6afc366a" (UID: "7b67d961-10c4-45b0-84e0-99ff6afc366a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.145327 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.145377 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b67d961-10c4-45b0-84e0-99ff6afc366a-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.145393 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwf2w\" (UniqueName: \"kubernetes.io/projected/7b67d961-10c4-45b0-84e0-99ff6afc366a-kube-api-access-qwf2w\") on node \"crc\" DevicePath \"\""
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.158577 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d5s7q"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.559519 4669 generic.go:334] "Generic (PLEG): container finished" podID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerID="085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51" exitCode=0
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.559587 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p69c8" event={"ID":"7b67d961-10c4-45b0-84e0-99ff6afc366a","Type":"ContainerDied","Data":"085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51"}
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.559678 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p69c8" event={"ID":"7b67d961-10c4-45b0-84e0-99ff6afc366a","Type":"ContainerDied","Data":"1a1378ed20cc8ad4f4a8a2333583980af216de872bef1580804b2a41e48152ed"}
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.559704 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p69c8"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.559717 4669 scope.go:117] "RemoveContainer" containerID="085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.580874 4669 scope.go:117] "RemoveContainer" containerID="7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.595698 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p69c8"]
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.597995 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p69c8"]
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.614609 4669 scope.go:117] "RemoveContainer" containerID="880dd85834b79bb98f10c3f0ea13d9fb3d66165a1b1da19c6a29ab7b13cd8d97"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.631834 4669 scope.go:117] "RemoveContainer" containerID="085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51"
Oct 01 11:33:13 crc kubenswrapper[4669]: E1001 11:33:13.632427 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51\": container with ID starting with 085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51 not found: ID does not exist" containerID="085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.632497 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51"} err="failed to get container status \"085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51\": rpc error: code = NotFound desc = could not find container \"085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51\": container with ID starting with 085ca2dcf7611bb4a565c34402dc5eaf6ac1e2ccbbc81bfcf09ff69c9af18c51 not found: ID does not exist"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.632537 4669 scope.go:117] "RemoveContainer" containerID="7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370"
Oct 01 11:33:13 crc kubenswrapper[4669]: E1001 11:33:13.633021 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370\": container with ID starting with 7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370 not found: ID does not exist" containerID="7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.633103 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370"} err="failed to get container status \"7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370\": rpc error: code = NotFound desc = could not find container \"7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370\": container with ID starting with 7ea20c2ee7a98e1dbe4e82987bbb56c0a72d51a0cf01273cd112a6856c4d6370 not found: ID does not exist"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.633147 4669 scope.go:117] "RemoveContainer" containerID="880dd85834b79bb98f10c3f0ea13d9fb3d66165a1b1da19c6a29ab7b13cd8d97"
Oct 01 11:33:13 crc kubenswrapper[4669]: E1001 11:33:13.633744 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"880dd85834b79bb98f10c3f0ea13d9fb3d66165a1b1da19c6a29ab7b13cd8d97\": container with ID starting with 880dd85834b79bb98f10c3f0ea13d9fb3d66165a1b1da19c6a29ab7b13cd8d97 not found: ID does not exist" containerID="880dd85834b79bb98f10c3f0ea13d9fb3d66165a1b1da19c6a29ab7b13cd8d97"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.633785 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880dd85834b79bb98f10c3f0ea13d9fb3d66165a1b1da19c6a29ab7b13cd8d97"} err="failed to get container status \"880dd85834b79bb98f10c3f0ea13d9fb3d66165a1b1da19c6a29ab7b13cd8d97\": rpc error: code = NotFound desc = could not find container \"880dd85834b79bb98f10c3f0ea13d9fb3d66165a1b1da19c6a29ab7b13cd8d97\": container with ID starting with 880dd85834b79bb98f10c3f0ea13d9fb3d66165a1b1da19c6a29ab7b13cd8d97 not found: ID does not exist"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.653270 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b67d961-10c4-45b0-84e0-99ff6afc366a" path="/var/lib/kubelet/pods/7b67d961-10c4-45b0-84e0-99ff6afc366a/volumes"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.760884 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5bzzr"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.801858 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5bzzr"
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.817210 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk6f2"]
Oct 01 11:33:13 crc kubenswrapper[4669]: I1001 11:33:13.818228 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kk6f2" podUID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerName="registry-server" containerID="cri-o://7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63" gracePeriod=2
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.275246 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6pmkp"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.322916 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6pmkp"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.331917 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk6f2"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.467322 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-utilities\") pod \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\" (UID: \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") "
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.467469 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-catalog-content\") pod \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\" (UID: \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") "
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.467532 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl82v\" (UniqueName: \"kubernetes.io/projected/356dfcd7-c70a-4494-aeed-89aa3393ecd9-kube-api-access-sl82v\") pod \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\" (UID: \"356dfcd7-c70a-4494-aeed-89aa3393ecd9\") "
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.468312 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-utilities" (OuterVolumeSpecName: "utilities") pod "356dfcd7-c70a-4494-aeed-89aa3393ecd9" (UID: "356dfcd7-c70a-4494-aeed-89aa3393ecd9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.468834 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.472391 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/356dfcd7-c70a-4494-aeed-89aa3393ecd9-kube-api-access-sl82v" (OuterVolumeSpecName: "kube-api-access-sl82v") pod "356dfcd7-c70a-4494-aeed-89aa3393ecd9" (UID: "356dfcd7-c70a-4494-aeed-89aa3393ecd9"). InnerVolumeSpecName "kube-api-access-sl82v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.514136 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "356dfcd7-c70a-4494-aeed-89aa3393ecd9" (UID: "356dfcd7-c70a-4494-aeed-89aa3393ecd9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.570163 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356dfcd7-c70a-4494-aeed-89aa3393ecd9-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.570226 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl82v\" (UniqueName: \"kubernetes.io/projected/356dfcd7-c70a-4494-aeed-89aa3393ecd9-kube-api-access-sl82v\") on node \"crc\" DevicePath \"\""
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.570704 4669 generic.go:334] "Generic (PLEG): container finished" podID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerID="7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63" exitCode=0
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.570779 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk6f2" event={"ID":"356dfcd7-c70a-4494-aeed-89aa3393ecd9","Type":"ContainerDied","Data":"7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63"}
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.570864 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk6f2" event={"ID":"356dfcd7-c70a-4494-aeed-89aa3393ecd9","Type":"ContainerDied","Data":"102f589db7fa912ba9395d959c865c2cb6efdb4f30f2600e800599e05bce2aff"}
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.570905 4669 scope.go:117] "RemoveContainer" containerID="7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.571523 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk6f2"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.591670 4669 scope.go:117] "RemoveContainer" containerID="8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.604516 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk6f2"]
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.607275 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kk6f2"]
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.629731 4669 scope.go:117] "RemoveContainer" containerID="d8ccecc384ce56c09e747ab2d173a64f7a4de94f506761bb48dae3ee99a0acab"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.644306 4669 scope.go:117] "RemoveContainer" containerID="7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63"
Oct 01 11:33:14 crc kubenswrapper[4669]: E1001 11:33:14.645068 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63\": container with ID starting with 7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63 not found: ID does not exist" containerID="7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.645157 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63"} err="failed to get container status \"7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63\": rpc error: code = NotFound desc = could not find container \"7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63\": container with ID starting with 7e9fc706d107691a14030bb8ff48c97a1a9205d97ec0ee5d7b6e2a365df0fc63 not found: ID does not exist"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.645197 4669 scope.go:117] "RemoveContainer" containerID="8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896"
Oct 01 11:33:14 crc kubenswrapper[4669]: E1001 11:33:14.645800 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896\": container with ID starting with 8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896 not found: ID does not exist" containerID="8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.645834 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896"} err="failed to get container status \"8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896\": rpc error: code = NotFound desc = could not find container \"8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896\": container with ID starting with 8728f5a23dd835d9f5b43c0c655d1853100e6b2ffd7f627fcc2322ad809c0896 not found: ID does not exist"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.645859 4669 scope.go:117] "RemoveContainer" containerID="d8ccecc384ce56c09e747ab2d173a64f7a4de94f506761bb48dae3ee99a0acab"
Oct 01 11:33:14 crc kubenswrapper[4669]: E1001 11:33:14.646237 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8ccecc384ce56c09e747ab2d173a64f7a4de94f506761bb48dae3ee99a0acab\": container with ID starting with d8ccecc384ce56c09e747ab2d173a64f7a4de94f506761bb48dae3ee99a0acab not found: ID does not exist" containerID="d8ccecc384ce56c09e747ab2d173a64f7a4de94f506761bb48dae3ee99a0acab"
Oct 01 11:33:14 crc kubenswrapper[4669]: I1001 11:33:14.646262 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8ccecc384ce56c09e747ab2d173a64f7a4de94f506761bb48dae3ee99a0acab"} err="failed to get container status \"d8ccecc384ce56c09e747ab2d173a64f7a4de94f506761bb48dae3ee99a0acab\": rpc error: code = NotFound desc = could not find container \"d8ccecc384ce56c09e747ab2d173a64f7a4de94f506761bb48dae3ee99a0acab\": container with ID starting with d8ccecc384ce56c09e747ab2d173a64f7a4de94f506761bb48dae3ee99a0acab not found: ID does not exist"
Oct 01 11:33:15 crc kubenswrapper[4669]: I1001 11:33:15.653111 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" path="/var/lib/kubelet/pods/356dfcd7-c70a-4494-aeed-89aa3393ecd9/volumes"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.218598 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5s7q"]
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.220050 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d5s7q" podUID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" containerName="registry-server" containerID="cri-o://8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070" gracePeriod=2
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.414935 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6pmkp"]
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.415397 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6pmkp" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerName="registry-server" containerID="cri-o://fd2323c1496ccf1c805c621a5398b97b9f41f3f3f9aa50fb6ee9681c7b87ee43" gracePeriod=2
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.564284 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5s7q"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.609640 4669 generic.go:334] "Generic (PLEG): container finished" podID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" containerID="8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070" exitCode=0
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.609859 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5s7q" event={"ID":"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b","Type":"ContainerDied","Data":"8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070"}
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.609900 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5s7q"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.609936 4669 scope.go:117] "RemoveContainer" containerID="8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.609920 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5s7q" event={"ID":"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b","Type":"ContainerDied","Data":"e7a9092997d39da6304a7e4ffd3523d41311b9ad89df253cb01718cbd103812e"}
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.625131 4669 generic.go:334] "Generic (PLEG): container finished" podID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerID="fd2323c1496ccf1c805c621a5398b97b9f41f3f3f9aa50fb6ee9681c7b87ee43" exitCode=0
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.625185 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pmkp" event={"ID":"bf3d0eba-0bac-4056-89ba-75708b18ab84","Type":"ContainerDied","Data":"fd2323c1496ccf1c805c621a5398b97b9f41f3f3f9aa50fb6ee9681c7b87ee43"}
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.627945 4669 scope.go:117] "RemoveContainer" containerID="93554a740f592e4abc7455f208beca4d5e35136b57234c0fb329c41ae1b5b6d3"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.643359 4669 scope.go:117] "RemoveContainer" containerID="defcc4511ac1de83acdaaddef2435041c281afff653ff805b27e076ca0d6680e"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.671141 4669 scope.go:117] "RemoveContainer" containerID="8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070"
Oct 01 11:33:16 crc kubenswrapper[4669]: E1001 11:33:16.671627 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070\": container with ID starting with 8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070 not found: ID does not exist" containerID="8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.671659 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070"} err="failed to get container status \"8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070\": rpc error: code = NotFound desc = could not find container \"8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070\": container with ID starting with 8f4b7a569b16ac7b4ecbc28cc2b1eb1ecc15ea38aa3cf0c7d4c1eb37915bb070 not found: ID does not exist"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.671682 4669 scope.go:117] "RemoveContainer" containerID="93554a740f592e4abc7455f208beca4d5e35136b57234c0fb329c41ae1b5b6d3"
Oct 01 11:33:16 crc kubenswrapper[4669]: E1001 11:33:16.672673 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93554a740f592e4abc7455f208beca4d5e35136b57234c0fb329c41ae1b5b6d3\": container with ID starting with 93554a740f592e4abc7455f208beca4d5e35136b57234c0fb329c41ae1b5b6d3 not found: ID does not exist" containerID="93554a740f592e4abc7455f208beca4d5e35136b57234c0fb329c41ae1b5b6d3"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.672709 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93554a740f592e4abc7455f208beca4d5e35136b57234c0fb329c41ae1b5b6d3"} err="failed to get container status \"93554a740f592e4abc7455f208beca4d5e35136b57234c0fb329c41ae1b5b6d3\": rpc error: code = NotFound desc = could not find container \"93554a740f592e4abc7455f208beca4d5e35136b57234c0fb329c41ae1b5b6d3\": container with ID starting with 93554a740f592e4abc7455f208beca4d5e35136b57234c0fb329c41ae1b5b6d3 not found: ID does not exist"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.672726 4669 scope.go:117] "RemoveContainer" containerID="defcc4511ac1de83acdaaddef2435041c281afff653ff805b27e076ca0d6680e"
Oct 01 11:33:16 crc kubenswrapper[4669]: E1001 11:33:16.673012 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"defcc4511ac1de83acdaaddef2435041c281afff653ff805b27e076ca0d6680e\": container with ID starting with defcc4511ac1de83acdaaddef2435041c281afff653ff805b27e076ca0d6680e not found: ID does not exist" containerID="defcc4511ac1de83acdaaddef2435041c281afff653ff805b27e076ca0d6680e"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.673030 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"defcc4511ac1de83acdaaddef2435041c281afff653ff805b27e076ca0d6680e"} err="failed to get container status \"defcc4511ac1de83acdaaddef2435041c281afff653ff805b27e076ca0d6680e\": rpc error: code = NotFound desc = could not find container \"defcc4511ac1de83acdaaddef2435041c281afff653ff805b27e076ca0d6680e\": container with ID starting with defcc4511ac1de83acdaaddef2435041c281afff653ff805b27e076ca0d6680e not found: ID does not exist"
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.707253 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rhrd\" (UniqueName: \"kubernetes.io/projected/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-kube-api-access-6rhrd\") pod \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\" (UID: \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") "
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.707319 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-catalog-content\") pod \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\" (UID: \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") "
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.707384 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-utilities\") pod \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\" (UID: \"b4423fe7-fc2d-4e49-b39f-a9641ce1c28b\") "
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.708492 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-utilities" (OuterVolumeSpecName: "utilities") pod "b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" (UID: "b4423fe7-fc2d-4e49-b39f-a9641ce1c28b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.716140 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-kube-api-access-6rhrd" (OuterVolumeSpecName: "kube-api-access-6rhrd") pod "b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" (UID: "b4423fe7-fc2d-4e49-b39f-a9641ce1c28b"). InnerVolumeSpecName "kube-api-access-6rhrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.730482 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.730562 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" (UID: "b4423fe7-fc2d-4e49-b39f-a9641ce1c28b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.809402 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm4rd\" (UniqueName: \"kubernetes.io/projected/bf3d0eba-0bac-4056-89ba-75708b18ab84-kube-api-access-xm4rd\") pod \"bf3d0eba-0bac-4056-89ba-75708b18ab84\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.809478 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-catalog-content\") pod \"bf3d0eba-0bac-4056-89ba-75708b18ab84\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.809517 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-utilities\") pod \"bf3d0eba-0bac-4056-89ba-75708b18ab84\" (UID: \"bf3d0eba-0bac-4056-89ba-75708b18ab84\") " Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.809749 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rhrd\" (UniqueName: 
\"kubernetes.io/projected/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-kube-api-access-6rhrd\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.809766 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.809779 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.810681 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-utilities" (OuterVolumeSpecName: "utilities") pod "bf3d0eba-0bac-4056-89ba-75708b18ab84" (UID: "bf3d0eba-0bac-4056-89ba-75708b18ab84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.815427 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3d0eba-0bac-4056-89ba-75708b18ab84-kube-api-access-xm4rd" (OuterVolumeSpecName: "kube-api-access-xm4rd") pod "bf3d0eba-0bac-4056-89ba-75708b18ab84" (UID: "bf3d0eba-0bac-4056-89ba-75708b18ab84"). InnerVolumeSpecName "kube-api-access-xm4rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.899048 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf3d0eba-0bac-4056-89ba-75708b18ab84" (UID: "bf3d0eba-0bac-4056-89ba-75708b18ab84"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.910992 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm4rd\" (UniqueName: \"kubernetes.io/projected/bf3d0eba-0bac-4056-89ba-75708b18ab84-kube-api-access-xm4rd\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.911044 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.911061 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf3d0eba-0bac-4056-89ba-75708b18ab84-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.937036 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5s7q"] Oct 01 11:33:16 crc kubenswrapper[4669]: I1001 11:33:16.949038 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5s7q"] Oct 01 11:33:17 crc kubenswrapper[4669]: I1001 11:33:17.635637 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6pmkp" event={"ID":"bf3d0eba-0bac-4056-89ba-75708b18ab84","Type":"ContainerDied","Data":"25dd36e0a0867d6450bac1dd0224984c4002c80bd84e28786af46148fb86b307"} Oct 01 11:33:17 crc kubenswrapper[4669]: I1001 11:33:17.635735 4669 scope.go:117] "RemoveContainer" containerID="fd2323c1496ccf1c805c621a5398b97b9f41f3f3f9aa50fb6ee9681c7b87ee43" Oct 01 11:33:17 crc kubenswrapper[4669]: I1001 11:33:17.635922 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6pmkp" Oct 01 11:33:17 crc kubenswrapper[4669]: I1001 11:33:17.654030 4669 scope.go:117] "RemoveContainer" containerID="41ee1a4fc56f4fc6e64f73d43b070b319a9d2258aa1477fcacaa7250b8dc23a2" Oct 01 11:33:17 crc kubenswrapper[4669]: I1001 11:33:17.660467 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" path="/var/lib/kubelet/pods/b4423fe7-fc2d-4e49-b39f-a9641ce1c28b/volumes" Oct 01 11:33:17 crc kubenswrapper[4669]: I1001 11:33:17.679770 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6pmkp"] Oct 01 11:33:17 crc kubenswrapper[4669]: I1001 11:33:17.683141 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6pmkp"] Oct 01 11:33:17 crc kubenswrapper[4669]: I1001 11:33:17.685965 4669 scope.go:117] "RemoveContainer" containerID="aeb1eeda1cf7231e3442063f4ddc9cfac0a488d7cb844192999ed7abd5f69488" Oct 01 11:33:19 crc kubenswrapper[4669]: I1001 11:33:19.653188 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" path="/var/lib/kubelet/pods/bf3d0eba-0bac-4056-89ba-75708b18ab84/volumes" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.240368 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n2l4s"] Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.241243 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n2l4s" podUID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerName="registry-server" containerID="cri-o://08474a5302b079f5e0568daadc3759cb4c0480594fa7b7ddbddbf0e93732786c" gracePeriod=30 Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.250390 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hwdf6"] Oct 01 
11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.250781 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hwdf6" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerName="registry-server" containerID="cri-o://e7adea424c81018c5f2fa03858830fa3d08c47d007ac1f19d5b52a1ed6c6ea11" gracePeriod=30 Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.252617 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dcmjv"] Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.252872 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" podUID="35ac61d5-a664-4f55-9a9d-c80e3dc18b16" containerName="marketplace-operator" containerID="cri-o://c216b01b6f61a6ac468e6d059a1b07aaaa48e711f9c563e80082200537d9de26" gracePeriod=30 Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.260625 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf24n"] Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.260872 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sf24n" podUID="b90bf450-0186-475a-97dd-cb6ad25fb687" containerName="registry-server" containerID="cri-o://4a652f988e42c4d1a7d69825106cc701a1ffd87c0df6c8d6820858a0990fd8ad" gracePeriod=30 Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.269057 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5bzzr"] Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.269436 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5bzzr" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerName="registry-server" 
containerID="cri-o://4f90ce6d0e16868414807f984f91f54c4b61b974f8210e4f85f7bd79109aaf37" gracePeriod=30 Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283054 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hg5t5"] Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283283 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" containerName="extract-utilities" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283296 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" containerName="extract-utilities" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283307 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerName="extract-content" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283314 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerName="extract-content" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283324 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerName="extract-utilities" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283330 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerName="extract-utilities" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283339 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" containerName="extract-content" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283345 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" containerName="extract-content" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283367 4669 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="dd41b5b5-ff57-43a0-ba85-cf8b1428f88c" containerName="pruner" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283373 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd41b5b5-ff57-43a0-ba85-cf8b1428f88c" containerName="pruner" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283384 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283390 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283398 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerName="extract-content" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283403 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerName="extract-content" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283412 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3114d12c-7a10-4421-97dd-027fb8137aef" containerName="pruner" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283418 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3114d12c-7a10-4421-97dd-027fb8137aef" containerName="pruner" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283430 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283435 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283445 4669 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerName="extract-utilities" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283452 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerName="extract-utilities" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283460 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283466 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283508 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerName="extract-utilities" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283517 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerName="extract-utilities" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283527 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerName="extract-content" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283566 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerName="extract-content" Oct 01 11:33:54 crc kubenswrapper[4669]: E1001 11:33:54.283576 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283582 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283690 4669 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7b67d961-10c4-45b0-84e0-99ff6afc366a" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283703 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4423fe7-fc2d-4e49-b39f-a9641ce1c28b" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283712 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd41b5b5-ff57-43a0-ba85-cf8b1428f88c" containerName="pruner" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283720 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3114d12c-7a10-4421-97dd-027fb8137aef" containerName="pruner" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283727 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3d0eba-0bac-4056-89ba-75708b18ab84" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.283735 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="356dfcd7-c70a-4494-aeed-89aa3393ecd9" containerName="registry-server" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.284264 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.299230 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hg5t5"] Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.421929 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7693f22a-6758-4b18-8161-c5eb5e27a395-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hg5t5\" (UID: \"7693f22a-6758-4b18-8161-c5eb5e27a395\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.422008 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhg2l\" (UniqueName: \"kubernetes.io/projected/7693f22a-6758-4b18-8161-c5eb5e27a395-kube-api-access-jhg2l\") pod \"marketplace-operator-79b997595-hg5t5\" (UID: \"7693f22a-6758-4b18-8161-c5eb5e27a395\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.422127 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7693f22a-6758-4b18-8161-c5eb5e27a395-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hg5t5\" (UID: \"7693f22a-6758-4b18-8161-c5eb5e27a395\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.481291 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8hc7m"] Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.523013 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/7693f22a-6758-4b18-8161-c5eb5e27a395-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hg5t5\" (UID: \"7693f22a-6758-4b18-8161-c5eb5e27a395\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.523483 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7693f22a-6758-4b18-8161-c5eb5e27a395-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hg5t5\" (UID: \"7693f22a-6758-4b18-8161-c5eb5e27a395\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.523516 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhg2l\" (UniqueName: \"kubernetes.io/projected/7693f22a-6758-4b18-8161-c5eb5e27a395-kube-api-access-jhg2l\") pod \"marketplace-operator-79b997595-hg5t5\" (UID: \"7693f22a-6758-4b18-8161-c5eb5e27a395\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.525241 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7693f22a-6758-4b18-8161-c5eb5e27a395-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hg5t5\" (UID: \"7693f22a-6758-4b18-8161-c5eb5e27a395\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.533462 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7693f22a-6758-4b18-8161-c5eb5e27a395-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hg5t5\" (UID: \"7693f22a-6758-4b18-8161-c5eb5e27a395\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 
11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.543798 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhg2l\" (UniqueName: \"kubernetes.io/projected/7693f22a-6758-4b18-8161-c5eb5e27a395-kube-api-access-jhg2l\") pod \"marketplace-operator-79b997595-hg5t5\" (UID: \"7693f22a-6758-4b18-8161-c5eb5e27a395\") " pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.702967 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.917260 4669 generic.go:334] "Generic (PLEG): container finished" podID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerID="e7adea424c81018c5f2fa03858830fa3d08c47d007ac1f19d5b52a1ed6c6ea11" exitCode=0 Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.917346 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdf6" event={"ID":"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0","Type":"ContainerDied","Data":"e7adea424c81018c5f2fa03858830fa3d08c47d007ac1f19d5b52a1ed6c6ea11"} Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.925763 4669 generic.go:334] "Generic (PLEG): container finished" podID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerID="08474a5302b079f5e0568daadc3759cb4c0480594fa7b7ddbddbf0e93732786c" exitCode=0 Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.925956 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2l4s" event={"ID":"0b8d4849-98dd-4b1b-90dc-9151e8b17224","Type":"ContainerDied","Data":"08474a5302b079f5e0568daadc3759cb4c0480594fa7b7ddbddbf0e93732786c"} Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.930143 4669 generic.go:334] "Generic (PLEG): container finished" podID="35ac61d5-a664-4f55-9a9d-c80e3dc18b16" 
containerID="c216b01b6f61a6ac468e6d059a1b07aaaa48e711f9c563e80082200537d9de26" exitCode=0 Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.930363 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" event={"ID":"35ac61d5-a664-4f55-9a9d-c80e3dc18b16","Type":"ContainerDied","Data":"c216b01b6f61a6ac468e6d059a1b07aaaa48e711f9c563e80082200537d9de26"} Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.934261 4669 generic.go:334] "Generic (PLEG): container finished" podID="b90bf450-0186-475a-97dd-cb6ad25fb687" containerID="4a652f988e42c4d1a7d69825106cc701a1ffd87c0df6c8d6820858a0990fd8ad" exitCode=0 Oct 01 11:33:54 crc kubenswrapper[4669]: I1001 11:33:54.934294 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf24n" event={"ID":"b90bf450-0186-475a-97dd-cb6ad25fb687","Type":"ContainerDied","Data":"4a652f988e42c4d1a7d69825106cc701a1ffd87c0df6c8d6820858a0990fd8ad"} Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.203314 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.230593 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hg5t5"] Oct 01 11:33:55 crc kubenswrapper[4669]: W1001 11:33:55.247255 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7693f22a_6758_4b18_8161_c5eb5e27a395.slice/crio-514891da8275af7d3b1b74d033e3ef6ad3a830cae57e46aee8874693e43f6d60 WatchSource:0}: Error finding container 514891da8275af7d3b1b74d033e3ef6ad3a830cae57e46aee8874693e43f6d60: Status 404 returned error can't find the container with id 514891da8275af7d3b1b74d033e3ef6ad3a830cae57e46aee8874693e43f6d60 Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.304965 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.328400 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.340662 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xktr\" (UniqueName: \"kubernetes.io/projected/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-kube-api-access-4xktr\") pod \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\" (UID: \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.340808 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-utilities\") pod \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\" (UID: \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.340867 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-catalog-content\") pod \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\" (UID: \"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0\") " Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.342411 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-utilities" (OuterVolumeSpecName: "utilities") pod "ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" (UID: "ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.347604 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-kube-api-access-4xktr" (OuterVolumeSpecName: "kube-api-access-4xktr") pod "ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" (UID: "ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0"). InnerVolumeSpecName "kube-api-access-4xktr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.381587 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.396537 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" (UID: "ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.441684 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-operator-metrics\") pod \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\" (UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.442191 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tccjw\" (UniqueName: \"kubernetes.io/projected/0b8d4849-98dd-4b1b-90dc-9151e8b17224-kube-api-access-tccjw\") pod \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.442302 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-utilities\") pod \"b90bf450-0186-475a-97dd-cb6ad25fb687\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.442447 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmgnw\" (UniqueName: 
\"kubernetes.io/projected/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-kube-api-access-lmgnw\") pod \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\" (UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.442527 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-utilities\") pod \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.442602 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-catalog-content\") pod \"b90bf450-0186-475a-97dd-cb6ad25fb687\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.442687 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-trusted-ca\") pod \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\" (UID: \"35ac61d5-a664-4f55-9a9d-c80e3dc18b16\") " Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.442764 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-catalog-content\") pod \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\" (UID: \"0b8d4849-98dd-4b1b-90dc-9151e8b17224\") " Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.442855 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjk78\" (UniqueName: \"kubernetes.io/projected/b90bf450-0186-475a-97dd-cb6ad25fb687-kube-api-access-vjk78\") pod \"b90bf450-0186-475a-97dd-cb6ad25fb687\" (UID: \"b90bf450-0186-475a-97dd-cb6ad25fb687\") " Oct 01 11:33:55 crc 
kubenswrapper[4669]: I1001 11:33:55.443171 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-utilities" (OuterVolumeSpecName: "utilities") pod "b90bf450-0186-475a-97dd-cb6ad25fb687" (UID: "b90bf450-0186-475a-97dd-cb6ad25fb687"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.443194 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.443281 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.443296 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xktr\" (UniqueName: \"kubernetes.io/projected/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0-kube-api-access-4xktr\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.443312 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-utilities" (OuterVolumeSpecName: "utilities") pod "0b8d4849-98dd-4b1b-90dc-9151e8b17224" (UID: "0b8d4849-98dd-4b1b-90dc-9151e8b17224"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.444442 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "35ac61d5-a664-4f55-9a9d-c80e3dc18b16" (UID: "35ac61d5-a664-4f55-9a9d-c80e3dc18b16"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.474499 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b90bf450-0186-475a-97dd-cb6ad25fb687" (UID: "b90bf450-0186-475a-97dd-cb6ad25fb687"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.487379 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b8d4849-98dd-4b1b-90dc-9151e8b17224" (UID: "0b8d4849-98dd-4b1b-90dc-9151e8b17224"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.497353 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b8d4849-98dd-4b1b-90dc-9151e8b17224-kube-api-access-tccjw" (OuterVolumeSpecName: "kube-api-access-tccjw") pod "0b8d4849-98dd-4b1b-90dc-9151e8b17224" (UID: "0b8d4849-98dd-4b1b-90dc-9151e8b17224"). InnerVolumeSpecName "kube-api-access-tccjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.497835 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "35ac61d5-a664-4f55-9a9d-c80e3dc18b16" (UID: "35ac61d5-a664-4f55-9a9d-c80e3dc18b16"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.500311 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-kube-api-access-lmgnw" (OuterVolumeSpecName: "kube-api-access-lmgnw") pod "35ac61d5-a664-4f55-9a9d-c80e3dc18b16" (UID: "35ac61d5-a664-4f55-9a9d-c80e3dc18b16"). InnerVolumeSpecName "kube-api-access-lmgnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.509512 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90bf450-0186-475a-97dd-cb6ad25fb687-kube-api-access-vjk78" (OuterVolumeSpecName: "kube-api-access-vjk78") pod "b90bf450-0186-475a-97dd-cb6ad25fb687" (UID: "b90bf450-0186-475a-97dd-cb6ad25fb687"). InnerVolumeSpecName "kube-api-access-vjk78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.545005 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tccjw\" (UniqueName: \"kubernetes.io/projected/0b8d4849-98dd-4b1b-90dc-9151e8b17224-kube-api-access-tccjw\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.545042 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.545054 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmgnw\" (UniqueName: \"kubernetes.io/projected/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-kube-api-access-lmgnw\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.545067 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.545100 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b90bf450-0186-475a-97dd-cb6ad25fb687-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.545112 4669 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.545122 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8d4849-98dd-4b1b-90dc-9151e8b17224-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc 
kubenswrapper[4669]: I1001 11:33:55.545131 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjk78\" (UniqueName: \"kubernetes.io/projected/b90bf450-0186-475a-97dd-cb6ad25fb687-kube-api-access-vjk78\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.545140 4669 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/35ac61d5-a664-4f55-9a9d-c80e3dc18b16-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.943735 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2l4s" event={"ID":"0b8d4849-98dd-4b1b-90dc-9151e8b17224","Type":"ContainerDied","Data":"92c4bcccbd45d80a42b780568943b89f999ba61ad747ccd6f9f832e4ae826559"} Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.943782 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n2l4s" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.943835 4669 scope.go:117] "RemoveContainer" containerID="08474a5302b079f5e0568daadc3759cb4c0480594fa7b7ddbddbf0e93732786c" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.947957 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwdf6" event={"ID":"ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0","Type":"ContainerDied","Data":"2b0c25ff2a9c9c70be0f2e6a4e3bd7e2a2c2949e284f37b75ff5dfe2e491bb49"} Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.948383 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hwdf6" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.953256 4669 generic.go:334] "Generic (PLEG): container finished" podID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerID="4f90ce6d0e16868414807f984f91f54c4b61b974f8210e4f85f7bd79109aaf37" exitCode=0 Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.953353 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzzr" event={"ID":"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb","Type":"ContainerDied","Data":"4f90ce6d0e16868414807f984f91f54c4b61b974f8210e4f85f7bd79109aaf37"} Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.960739 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.961061 4669 scope.go:117] "RemoveContainer" containerID="427d0110cf2f090a8104120d99740985c7bb63326d980c7e8a52cb28860fdf9e" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.960982 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dcmjv" event={"ID":"35ac61d5-a664-4f55-9a9d-c80e3dc18b16","Type":"ContainerDied","Data":"5c257f282690fb7944a914847efb9c012117bcf7ce8895a7162c7d012d70243c"} Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.966854 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n2l4s"] Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.978646 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n2l4s"] Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.986325 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sf24n" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.987990 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sf24n" event={"ID":"b90bf450-0186-475a-97dd-cb6ad25fb687","Type":"ContainerDied","Data":"1ad7dfe022468d5dfb86e0096c698a667faef5523c53dc3aba55db8f7c55c199"} Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.995607 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" event={"ID":"7693f22a-6758-4b18-8161-c5eb5e27a395","Type":"ContainerStarted","Data":"13c4d8fc14770e8218882c3ac8cfc10b903e4823f4302e92c4498b05e8a19598"} Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.995664 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" event={"ID":"7693f22a-6758-4b18-8161-c5eb5e27a395","Type":"ContainerStarted","Data":"514891da8275af7d3b1b74d033e3ef6ad3a830cae57e46aee8874693e43f6d60"} Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.995954 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.998297 4669 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hg5t5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.54:8080/healthz\": dial tcp 10.217.0.54:8080: connect: connection refused" start-of-body= Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.998349 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" podUID="7693f22a-6758-4b18-8161-c5eb5e27a395" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.54:8080/healthz\": dial tcp 10.217.0.54:8080: connect: 
connection refused" Oct 01 11:33:55 crc kubenswrapper[4669]: I1001 11:33:55.998635 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dcmjv"] Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.010139 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dcmjv"] Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.015404 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hwdf6"] Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.017778 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hwdf6"] Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.022700 4669 scope.go:117] "RemoveContainer" containerID="e36771dab9db36589cddb757be899da08d4ce5c6ea2b0e34e98cfc390e2a9cd1" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.023396 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" podStartSLOduration=2.023372279 podStartE2EDuration="2.023372279s" podCreationTimestamp="2025-10-01 11:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:33:56.022848806 +0000 UTC m=+327.122413783" watchObservedRunningTime="2025-10-01 11:33:56.023372279 +0000 UTC m=+327.122937256" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.038571 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf24n"] Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.071886 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sf24n"] Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.077146 4669 scope.go:117] "RemoveContainer" 
containerID="e7adea424c81018c5f2fa03858830fa3d08c47d007ac1f19d5b52a1ed6c6ea11" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.082652 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.092641 4669 scope.go:117] "RemoveContainer" containerID="792934d93e8496f47e67defb5cd77b2defaf703d608aadd33fedafdb983c8eb2" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.127506 4669 scope.go:117] "RemoveContainer" containerID="cff7aa9a4509e7a30757711d6805a6807d49255512c73b8274a04239b818c737" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.150199 4669 scope.go:117] "RemoveContainer" containerID="c216b01b6f61a6ac468e6d059a1b07aaaa48e711f9c563e80082200537d9de26" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.154384 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-catalog-content\") pod \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.154450 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-utilities\") pod \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.154493 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzvxj\" (UniqueName: \"kubernetes.io/projected/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-kube-api-access-nzvxj\") pod \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\" (UID: \"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb\") " Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.155453 4669 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-utilities" (OuterVolumeSpecName: "utilities") pod "59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" (UID: "59f36b7c-ac0a-4ca4-90e3-2dfd686760fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.161336 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-kube-api-access-nzvxj" (OuterVolumeSpecName: "kube-api-access-nzvxj") pod "59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" (UID: "59f36b7c-ac0a-4ca4-90e3-2dfd686760fb"). InnerVolumeSpecName "kube-api-access-nzvxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.168933 4669 scope.go:117] "RemoveContainer" containerID="4a652f988e42c4d1a7d69825106cc701a1ffd87c0df6c8d6820858a0990fd8ad" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.184428 4669 scope.go:117] "RemoveContainer" containerID="c7c24d89b46ecd7f19910f77216500c0fae08582fb17ff94f8b755fbcb644374" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.197532 4669 scope.go:117] "RemoveContainer" containerID="6fb7b1e21dcb53cab148677b9fe69372542a364c3ffdaa9844eb09258acfdfc4" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.233114 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" (UID: "59f36b7c-ac0a-4ca4-90e3-2dfd686760fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.256738 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzvxj\" (UniqueName: \"kubernetes.io/projected/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-kube-api-access-nzvxj\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.256792 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.256806 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.465788 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ttz57"] Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.466991 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.467198 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.467332 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerName="extract-utilities" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.467445 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerName="extract-utilities" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.467641 4669 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerName="extract-content" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.467816 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerName="extract-content" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.468126 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.468248 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.468360 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerName="extract-content" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.468469 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerName="extract-content" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.468583 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90bf450-0186-475a-97dd-cb6ad25fb687" containerName="extract-content" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.468708 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90bf450-0186-475a-97dd-cb6ad25fb687" containerName="extract-content" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.468840 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerName="extract-content" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.468958 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerName="extract-content" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.469100 4669 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b90bf450-0186-475a-97dd-cb6ad25fb687" containerName="extract-utilities" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.469218 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90bf450-0186-475a-97dd-cb6ad25fb687" containerName="extract-utilities" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.469333 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.469441 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.469568 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ac61d5-a664-4f55-9a9d-c80e3dc18b16" containerName="marketplace-operator" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.469678 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ac61d5-a664-4f55-9a9d-c80e3dc18b16" containerName="marketplace-operator" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.469793 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90bf450-0186-475a-97dd-cb6ad25fb687" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.469900 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90bf450-0186-475a-97dd-cb6ad25fb687" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.470203 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerName="extract-utilities" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.470323 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerName="extract-utilities" Oct 01 11:33:56 crc kubenswrapper[4669]: E1001 11:33:56.470453 4669 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerName="extract-utilities" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.470564 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerName="extract-utilities" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.470841 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.470977 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.471141 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90bf450-0186-475a-97dd-cb6ad25fb687" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.471283 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" containerName="registry-server" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.471403 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ac61d5-a664-4f55-9a9d-c80e3dc18b16" containerName="marketplace-operator" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.474802 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.477670 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.477762 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttz57"] Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.562943 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42fjp\" (UniqueName: \"kubernetes.io/projected/9eab31d8-034e-464c-a5c8-f24b4dcbccb7-kube-api-access-42fjp\") pod \"redhat-marketplace-ttz57\" (UID: \"9eab31d8-034e-464c-a5c8-f24b4dcbccb7\") " pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.563158 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eab31d8-034e-464c-a5c8-f24b4dcbccb7-catalog-content\") pod \"redhat-marketplace-ttz57\" (UID: \"9eab31d8-034e-464c-a5c8-f24b4dcbccb7\") " pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.563568 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eab31d8-034e-464c-a5c8-f24b4dcbccb7-utilities\") pod \"redhat-marketplace-ttz57\" (UID: \"9eab31d8-034e-464c-a5c8-f24b4dcbccb7\") " pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.665571 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eab31d8-034e-464c-a5c8-f24b4dcbccb7-utilities\") pod \"redhat-marketplace-ttz57\" (UID: 
\"9eab31d8-034e-464c-a5c8-f24b4dcbccb7\") " pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.665803 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42fjp\" (UniqueName: \"kubernetes.io/projected/9eab31d8-034e-464c-a5c8-f24b4dcbccb7-kube-api-access-42fjp\") pod \"redhat-marketplace-ttz57\" (UID: \"9eab31d8-034e-464c-a5c8-f24b4dcbccb7\") " pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.665926 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eab31d8-034e-464c-a5c8-f24b4dcbccb7-catalog-content\") pod \"redhat-marketplace-ttz57\" (UID: \"9eab31d8-034e-464c-a5c8-f24b4dcbccb7\") " pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.666625 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eab31d8-034e-464c-a5c8-f24b4dcbccb7-catalog-content\") pod \"redhat-marketplace-ttz57\" (UID: \"9eab31d8-034e-464c-a5c8-f24b4dcbccb7\") " pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.666655 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eab31d8-034e-464c-a5c8-f24b4dcbccb7-utilities\") pod \"redhat-marketplace-ttz57\" (UID: \"9eab31d8-034e-464c-a5c8-f24b4dcbccb7\") " pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.685242 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42fjp\" (UniqueName: \"kubernetes.io/projected/9eab31d8-034e-464c-a5c8-f24b4dcbccb7-kube-api-access-42fjp\") pod \"redhat-marketplace-ttz57\" (UID: 
\"9eab31d8-034e-464c-a5c8-f24b4dcbccb7\") " pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:33:56 crc kubenswrapper[4669]: I1001 11:33:56.792257 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.007801 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzzr" event={"ID":"59f36b7c-ac0a-4ca4-90e3-2dfd686760fb","Type":"ContainerDied","Data":"047ee96be580d7f80bd5cac485521134f067823abf7e9d0dabd5d78ab11035a7"} Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.007894 4669 scope.go:117] "RemoveContainer" containerID="4f90ce6d0e16868414807f984f91f54c4b61b974f8210e4f85f7bd79109aaf37" Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.007905 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5bzzr" Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.019182 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hg5t5" Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.035632 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttz57"] Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.045458 4669 scope.go:117] "RemoveContainer" containerID="39272a89e432cd886731b92af37685403485cc483a13b91852dcc61c32bdc07f" Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.068329 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5bzzr"] Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.068962 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5bzzr"] Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.090588 4669 scope.go:117] "RemoveContainer" 
containerID="f1d13bb1336d7210e7a66c43fd39b9c3aefe50e3a4655d40f75c47741693c89d" Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.656472 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b8d4849-98dd-4b1b-90dc-9151e8b17224" path="/var/lib/kubelet/pods/0b8d4849-98dd-4b1b-90dc-9151e8b17224/volumes" Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.657413 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ac61d5-a664-4f55-9a9d-c80e3dc18b16" path="/var/lib/kubelet/pods/35ac61d5-a664-4f55-9a9d-c80e3dc18b16/volumes" Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.657853 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f36b7c-ac0a-4ca4-90e3-2dfd686760fb" path="/var/lib/kubelet/pods/59f36b7c-ac0a-4ca4-90e3-2dfd686760fb/volumes" Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.658700 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b90bf450-0186-475a-97dd-cb6ad25fb687" path="/var/lib/kubelet/pods/b90bf450-0186-475a-97dd-cb6ad25fb687/volumes" Oct 01 11:33:57 crc kubenswrapper[4669]: I1001 11:33:57.659787 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0" path="/var/lib/kubelet/pods/ed0742b0-d8c4-4b2a-b0d1-2ea0bb44d5e0/volumes" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.017044 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttz57" event={"ID":"9eab31d8-034e-464c-a5c8-f24b4dcbccb7","Type":"ContainerStarted","Data":"d646a1252ee91950c9c6f73b8180c5324b9402a1a8ee86c1ca5f2f0b865d9f9a"} Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.664252 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ntnfv"] Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.675491 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.676717 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ntnfv"] Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.679566 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.795942 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5df8eb3-5517-4e0c-af77-565bddc9fe52-catalog-content\") pod \"community-operators-ntnfv\" (UID: \"a5df8eb3-5517-4e0c-af77-565bddc9fe52\") " pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.796101 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5df8eb3-5517-4e0c-af77-565bddc9fe52-utilities\") pod \"community-operators-ntnfv\" (UID: \"a5df8eb3-5517-4e0c-af77-565bddc9fe52\") " pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.796149 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b6g2\" (UniqueName: \"kubernetes.io/projected/a5df8eb3-5517-4e0c-af77-565bddc9fe52-kube-api-access-5b6g2\") pod \"community-operators-ntnfv\" (UID: \"a5df8eb3-5517-4e0c-af77-565bddc9fe52\") " pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.860400 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f2vsx"] Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.861697 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.864729 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.881592 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f2vsx"] Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.897129 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b6g2\" (UniqueName: \"kubernetes.io/projected/a5df8eb3-5517-4e0c-af77-565bddc9fe52-kube-api-access-5b6g2\") pod \"community-operators-ntnfv\" (UID: \"a5df8eb3-5517-4e0c-af77-565bddc9fe52\") " pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.897225 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5df8eb3-5517-4e0c-af77-565bddc9fe52-catalog-content\") pod \"community-operators-ntnfv\" (UID: \"a5df8eb3-5517-4e0c-af77-565bddc9fe52\") " pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.897261 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5df8eb3-5517-4e0c-af77-565bddc9fe52-utilities\") pod \"community-operators-ntnfv\" (UID: \"a5df8eb3-5517-4e0c-af77-565bddc9fe52\") " pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.897818 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5df8eb3-5517-4e0c-af77-565bddc9fe52-catalog-content\") pod \"community-operators-ntnfv\" (UID: \"a5df8eb3-5517-4e0c-af77-565bddc9fe52\") " 
pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.897852 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5df8eb3-5517-4e0c-af77-565bddc9fe52-utilities\") pod \"community-operators-ntnfv\" (UID: \"a5df8eb3-5517-4e0c-af77-565bddc9fe52\") " pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.922636 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b6g2\" (UniqueName: \"kubernetes.io/projected/a5df8eb3-5517-4e0c-af77-565bddc9fe52-kube-api-access-5b6g2\") pod \"community-operators-ntnfv\" (UID: \"a5df8eb3-5517-4e0c-af77-565bddc9fe52\") " pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.998038 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.998314 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z7c5\" (UniqueName: \"kubernetes.io/projected/2e4864c1-9d72-45e1-a602-fe0a6687811c-kube-api-access-4z7c5\") pod \"certified-operators-f2vsx\" (UID: \"2e4864c1-9d72-45e1-a602-fe0a6687811c\") " pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.998436 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4864c1-9d72-45e1-a602-fe0a6687811c-utilities\") pod \"certified-operators-f2vsx\" (UID: \"2e4864c1-9d72-45e1-a602-fe0a6687811c\") " pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:33:58 crc kubenswrapper[4669]: I1001 11:33:58.998471 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4864c1-9d72-45e1-a602-fe0a6687811c-catalog-content\") pod \"certified-operators-f2vsx\" (UID: \"2e4864c1-9d72-45e1-a602-fe0a6687811c\") " pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:33:59 crc kubenswrapper[4669]: I1001 11:33:59.027828 4669 generic.go:334] "Generic (PLEG): container finished" podID="9eab31d8-034e-464c-a5c8-f24b4dcbccb7" containerID="52fd171e7a5213c362b408ed1cc1186cae8b3b4558ec2e7e89bede6cf051222f" exitCode=0 Oct 01 11:33:59 crc kubenswrapper[4669]: I1001 11:33:59.027886 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttz57" event={"ID":"9eab31d8-034e-464c-a5c8-f24b4dcbccb7","Type":"ContainerDied","Data":"52fd171e7a5213c362b408ed1cc1186cae8b3b4558ec2e7e89bede6cf051222f"} Oct 01 11:33:59 crc kubenswrapper[4669]: I1001 11:33:59.099716 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z7c5\" (UniqueName: \"kubernetes.io/projected/2e4864c1-9d72-45e1-a602-fe0a6687811c-kube-api-access-4z7c5\") pod \"certified-operators-f2vsx\" (UID: \"2e4864c1-9d72-45e1-a602-fe0a6687811c\") " pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:33:59 crc kubenswrapper[4669]: I1001 11:33:59.100280 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4864c1-9d72-45e1-a602-fe0a6687811c-utilities\") pod \"certified-operators-f2vsx\" (UID: \"2e4864c1-9d72-45e1-a602-fe0a6687811c\") " pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:33:59 crc kubenswrapper[4669]: I1001 11:33:59.100319 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4864c1-9d72-45e1-a602-fe0a6687811c-catalog-content\") pod \"certified-operators-f2vsx\" (UID: 
\"2e4864c1-9d72-45e1-a602-fe0a6687811c\") " pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:33:59 crc kubenswrapper[4669]: I1001 11:33:59.101109 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4864c1-9d72-45e1-a602-fe0a6687811c-catalog-content\") pod \"certified-operators-f2vsx\" (UID: \"2e4864c1-9d72-45e1-a602-fe0a6687811c\") " pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:33:59 crc kubenswrapper[4669]: I1001 11:33:59.101249 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4864c1-9d72-45e1-a602-fe0a6687811c-utilities\") pod \"certified-operators-f2vsx\" (UID: \"2e4864c1-9d72-45e1-a602-fe0a6687811c\") " pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:33:59 crc kubenswrapper[4669]: I1001 11:33:59.120381 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z7c5\" (UniqueName: \"kubernetes.io/projected/2e4864c1-9d72-45e1-a602-fe0a6687811c-kube-api-access-4z7c5\") pod \"certified-operators-f2vsx\" (UID: \"2e4864c1-9d72-45e1-a602-fe0a6687811c\") " pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:33:59 crc kubenswrapper[4669]: I1001 11:33:59.181400 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:33:59 crc kubenswrapper[4669]: I1001 11:33:59.243988 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ntnfv"] Oct 01 11:33:59 crc kubenswrapper[4669]: W1001 11:33:59.255628 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5df8eb3_5517_4e0c_af77_565bddc9fe52.slice/crio-8322a258798a4ce2b1013ade486e1ff0228530e350224758703c3987d6121e0c WatchSource:0}: Error finding container 8322a258798a4ce2b1013ade486e1ff0228530e350224758703c3987d6121e0c: Status 404 returned error can't find the container with id 8322a258798a4ce2b1013ade486e1ff0228530e350224758703c3987d6121e0c Oct 01 11:33:59 crc kubenswrapper[4669]: W1001 11:33:59.391986 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e4864c1_9d72_45e1_a602_fe0a6687811c.slice/crio-2b86572fd300b3ce82971e3c27eb83da2921e1670c4373be745d2ae7d6fd1995 WatchSource:0}: Error finding container 2b86572fd300b3ce82971e3c27eb83da2921e1670c4373be745d2ae7d6fd1995: Status 404 returned error can't find the container with id 2b86572fd300b3ce82971e3c27eb83da2921e1670c4373be745d2ae7d6fd1995 Oct 01 11:33:59 crc kubenswrapper[4669]: I1001 11:33:59.392913 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f2vsx"] Oct 01 11:34:00 crc kubenswrapper[4669]: I1001 11:34:00.036715 4669 generic.go:334] "Generic (PLEG): container finished" podID="2e4864c1-9d72-45e1-a602-fe0a6687811c" containerID="e7422f98ff205f9d3d78dc37c9a51f74571ff0aec41dac6825b651f5d80b09ec" exitCode=0 Oct 01 11:34:00 crc kubenswrapper[4669]: I1001 11:34:00.036986 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2vsx" 
event={"ID":"2e4864c1-9d72-45e1-a602-fe0a6687811c","Type":"ContainerDied","Data":"e7422f98ff205f9d3d78dc37c9a51f74571ff0aec41dac6825b651f5d80b09ec"} Oct 01 11:34:00 crc kubenswrapper[4669]: I1001 11:34:00.037157 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2vsx" event={"ID":"2e4864c1-9d72-45e1-a602-fe0a6687811c","Type":"ContainerStarted","Data":"2b86572fd300b3ce82971e3c27eb83da2921e1670c4373be745d2ae7d6fd1995"} Oct 01 11:34:00 crc kubenswrapper[4669]: I1001 11:34:00.038846 4669 generic.go:334] "Generic (PLEG): container finished" podID="a5df8eb3-5517-4e0c-af77-565bddc9fe52" containerID="b2a13eb10adb2752f1fc66ece92f9615adaf0cd9cf576f65d4cdde42fbe43c19" exitCode=0 Oct 01 11:34:00 crc kubenswrapper[4669]: I1001 11:34:00.038871 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntnfv" event={"ID":"a5df8eb3-5517-4e0c-af77-565bddc9fe52","Type":"ContainerDied","Data":"b2a13eb10adb2752f1fc66ece92f9615adaf0cd9cf576f65d4cdde42fbe43c19"} Oct 01 11:34:00 crc kubenswrapper[4669]: I1001 11:34:00.038888 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntnfv" event={"ID":"a5df8eb3-5517-4e0c-af77-565bddc9fe52","Type":"ContainerStarted","Data":"8322a258798a4ce2b1013ade486e1ff0228530e350224758703c3987d6121e0c"} Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.062149 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9gtjj"] Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.063548 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.065722 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.086965 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9gtjj"] Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.134089 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d950283-1340-49ba-8ddb-35326c3f375e-catalog-content\") pod \"redhat-operators-9gtjj\" (UID: \"1d950283-1340-49ba-8ddb-35326c3f375e\") " pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.134138 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d950283-1340-49ba-8ddb-35326c3f375e-utilities\") pod \"redhat-operators-9gtjj\" (UID: \"1d950283-1340-49ba-8ddb-35326c3f375e\") " pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.134172 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72hq4\" (UniqueName: \"kubernetes.io/projected/1d950283-1340-49ba-8ddb-35326c3f375e-kube-api-access-72hq4\") pod \"redhat-operators-9gtjj\" (UID: \"1d950283-1340-49ba-8ddb-35326c3f375e\") " pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.235423 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72hq4\" (UniqueName: \"kubernetes.io/projected/1d950283-1340-49ba-8ddb-35326c3f375e-kube-api-access-72hq4\") pod \"redhat-operators-9gtjj\" (UID: 
\"1d950283-1340-49ba-8ddb-35326c3f375e\") " pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.235565 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d950283-1340-49ba-8ddb-35326c3f375e-catalog-content\") pod \"redhat-operators-9gtjj\" (UID: \"1d950283-1340-49ba-8ddb-35326c3f375e\") " pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.235607 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d950283-1340-49ba-8ddb-35326c3f375e-utilities\") pod \"redhat-operators-9gtjj\" (UID: \"1d950283-1340-49ba-8ddb-35326c3f375e\") " pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.236246 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d950283-1340-49ba-8ddb-35326c3f375e-catalog-content\") pod \"redhat-operators-9gtjj\" (UID: \"1d950283-1340-49ba-8ddb-35326c3f375e\") " pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.236569 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d950283-1340-49ba-8ddb-35326c3f375e-utilities\") pod \"redhat-operators-9gtjj\" (UID: \"1d950283-1340-49ba-8ddb-35326c3f375e\") " pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.261370 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72hq4\" (UniqueName: \"kubernetes.io/projected/1d950283-1340-49ba-8ddb-35326c3f375e-kube-api-access-72hq4\") pod \"redhat-operators-9gtjj\" (UID: \"1d950283-1340-49ba-8ddb-35326c3f375e\") " 
pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:01 crc kubenswrapper[4669]: I1001 11:34:01.387312 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:03 crc kubenswrapper[4669]: I1001 11:34:03.254243 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9gtjj"] Oct 01 11:34:04 crc kubenswrapper[4669]: I1001 11:34:04.063901 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gtjj" event={"ID":"1d950283-1340-49ba-8ddb-35326c3f375e","Type":"ContainerStarted","Data":"b778942a1c12a56c6362556bc874a7e3e62c24040fa2b1fb54f22044c652e933"} Oct 01 11:34:06 crc kubenswrapper[4669]: I1001 11:34:06.078756 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntnfv" event={"ID":"a5df8eb3-5517-4e0c-af77-565bddc9fe52","Type":"ContainerStarted","Data":"d7537b2faa0cea74ea84d0514e4419756d317dc63962be78a0fbcd938d55d446"} Oct 01 11:34:06 crc kubenswrapper[4669]: I1001 11:34:06.082456 4669 generic.go:334] "Generic (PLEG): container finished" podID="9eab31d8-034e-464c-a5c8-f24b4dcbccb7" containerID="ac81bdcf939008a02cb5ef6da9790debf8de6f0cca3a2f6dda6c4ed51865b39a" exitCode=0 Oct 01 11:34:06 crc kubenswrapper[4669]: I1001 11:34:06.082521 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttz57" event={"ID":"9eab31d8-034e-464c-a5c8-f24b4dcbccb7","Type":"ContainerDied","Data":"ac81bdcf939008a02cb5ef6da9790debf8de6f0cca3a2f6dda6c4ed51865b39a"} Oct 01 11:34:06 crc kubenswrapper[4669]: I1001 11:34:06.085441 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2vsx" event={"ID":"2e4864c1-9d72-45e1-a602-fe0a6687811c","Type":"ContainerStarted","Data":"34936b8b1fed98ffa16b025d6854729e9877cb115ab17e1a687a5d38c5721c45"} Oct 01 11:34:06 crc kubenswrapper[4669]: 
I1001 11:34:06.087404 4669 generic.go:334] "Generic (PLEG): container finished" podID="1d950283-1340-49ba-8ddb-35326c3f375e" containerID="f3aa0b5b8fbc9ae2ec55a709c2cb49a2083a34ffaa02789c7ff6c9d0f0789e2d" exitCode=0 Oct 01 11:34:06 crc kubenswrapper[4669]: I1001 11:34:06.087469 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gtjj" event={"ID":"1d950283-1340-49ba-8ddb-35326c3f375e","Type":"ContainerDied","Data":"f3aa0b5b8fbc9ae2ec55a709c2cb49a2083a34ffaa02789c7ff6c9d0f0789e2d"} Oct 01 11:34:07 crc kubenswrapper[4669]: I1001 11:34:07.095343 4669 generic.go:334] "Generic (PLEG): container finished" podID="a5df8eb3-5517-4e0c-af77-565bddc9fe52" containerID="d7537b2faa0cea74ea84d0514e4419756d317dc63962be78a0fbcd938d55d446" exitCode=0 Oct 01 11:34:07 crc kubenswrapper[4669]: I1001 11:34:07.095460 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntnfv" event={"ID":"a5df8eb3-5517-4e0c-af77-565bddc9fe52","Type":"ContainerDied","Data":"d7537b2faa0cea74ea84d0514e4419756d317dc63962be78a0fbcd938d55d446"} Oct 01 11:34:07 crc kubenswrapper[4669]: I1001 11:34:07.098063 4669 generic.go:334] "Generic (PLEG): container finished" podID="2e4864c1-9d72-45e1-a602-fe0a6687811c" containerID="34936b8b1fed98ffa16b025d6854729e9877cb115ab17e1a687a5d38c5721c45" exitCode=0 Oct 01 11:34:07 crc kubenswrapper[4669]: I1001 11:34:07.098187 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2vsx" event={"ID":"2e4864c1-9d72-45e1-a602-fe0a6687811c","Type":"ContainerDied","Data":"34936b8b1fed98ffa16b025d6854729e9877cb115ab17e1a687a5d38c5721c45"} Oct 01 11:34:10 crc kubenswrapper[4669]: I1001 11:34:10.124012 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttz57" 
event={"ID":"9eab31d8-034e-464c-a5c8-f24b4dcbccb7","Type":"ContainerStarted","Data":"f2c1cb1d137055a7b7c5998182d70e15f7e91c30b4bd9160b16a39d8a99767c7"} Oct 01 11:34:10 crc kubenswrapper[4669]: I1001 11:34:10.136608 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntnfv" event={"ID":"a5df8eb3-5517-4e0c-af77-565bddc9fe52","Type":"ContainerStarted","Data":"a59038bd734e13c9e388dfc79483a285171094713ce8f04fdfeecac0413ffa16"} Oct 01 11:34:10 crc kubenswrapper[4669]: I1001 11:34:10.156269 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ttz57" podStartSLOduration=4.946226361 podStartE2EDuration="14.156247892s" podCreationTimestamp="2025-10-01 11:33:56 +0000 UTC" firstStartedPulling="2025-10-01 11:33:59.03097788 +0000 UTC m=+330.130542847" lastFinishedPulling="2025-10-01 11:34:08.240999411 +0000 UTC m=+339.340564378" observedRunningTime="2025-10-01 11:34:10.153750031 +0000 UTC m=+341.253315018" watchObservedRunningTime="2025-10-01 11:34:10.156247892 +0000 UTC m=+341.255812869" Oct 01 11:34:10 crc kubenswrapper[4669]: I1001 11:34:10.178956 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ntnfv" podStartSLOduration=3.195421249 podStartE2EDuration="12.178928751s" podCreationTimestamp="2025-10-01 11:33:58 +0000 UTC" firstStartedPulling="2025-10-01 11:34:00.03977323 +0000 UTC m=+331.139338217" lastFinishedPulling="2025-10-01 11:34:09.023280742 +0000 UTC m=+340.122845719" observedRunningTime="2025-10-01 11:34:10.177427973 +0000 UTC m=+341.276992970" watchObservedRunningTime="2025-10-01 11:34:10.178928751 +0000 UTC m=+341.278493728" Oct 01 11:34:11 crc kubenswrapper[4669]: I1001 11:34:11.147332 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gtjj" 
event={"ID":"1d950283-1340-49ba-8ddb-35326c3f375e","Type":"ContainerStarted","Data":"cefade995e3489921ae104e70081ef9df87a985d39750d6be22d13e3985cd31d"} Oct 01 11:34:11 crc kubenswrapper[4669]: I1001 11:34:11.152999 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2vsx" event={"ID":"2e4864c1-9d72-45e1-a602-fe0a6687811c","Type":"ContainerStarted","Data":"723f57a525fcbc363cca313ac3493cea4044c4a0463c756b04038958187d8cf2"} Oct 01 11:34:11 crc kubenswrapper[4669]: I1001 11:34:11.198502 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f2vsx" podStartSLOduration=4.097655935 podStartE2EDuration="13.198474727s" podCreationTimestamp="2025-10-01 11:33:58 +0000 UTC" firstStartedPulling="2025-10-01 11:34:00.040159359 +0000 UTC m=+331.139724346" lastFinishedPulling="2025-10-01 11:34:09.140978161 +0000 UTC m=+340.240543138" observedRunningTime="2025-10-01 11:34:11.19006913 +0000 UTC m=+342.289634137" watchObservedRunningTime="2025-10-01 11:34:11.198474727 +0000 UTC m=+342.298039724" Oct 01 11:34:12 crc kubenswrapper[4669]: I1001 11:34:12.160770 4669 generic.go:334] "Generic (PLEG): container finished" podID="1d950283-1340-49ba-8ddb-35326c3f375e" containerID="cefade995e3489921ae104e70081ef9df87a985d39750d6be22d13e3985cd31d" exitCode=0 Oct 01 11:34:12 crc kubenswrapper[4669]: I1001 11:34:12.160876 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gtjj" event={"ID":"1d950283-1340-49ba-8ddb-35326c3f375e","Type":"ContainerDied","Data":"cefade995e3489921ae104e70081ef9df87a985d39750d6be22d13e3985cd31d"} Oct 01 11:34:15 crc kubenswrapper[4669]: I1001 11:34:15.183438 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gtjj" 
event={"ID":"1d950283-1340-49ba-8ddb-35326c3f375e","Type":"ContainerStarted","Data":"2f7a7c2c9998c79070805fa1777055c9d228f8eb7efdaec0641523bb8ce0e0f8"} Oct 01 11:34:15 crc kubenswrapper[4669]: I1001 11:34:15.209531 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9gtjj" podStartSLOduration=7.482248333 podStartE2EDuration="14.209512535s" podCreationTimestamp="2025-10-01 11:34:01 +0000 UTC" firstStartedPulling="2025-10-01 11:34:07.100381303 +0000 UTC m=+338.199946280" lastFinishedPulling="2025-10-01 11:34:13.827645485 +0000 UTC m=+344.927210482" observedRunningTime="2025-10-01 11:34:15.205012824 +0000 UTC m=+346.304577801" watchObservedRunningTime="2025-10-01 11:34:15.209512535 +0000 UTC m=+346.309077512" Oct 01 11:34:16 crc kubenswrapper[4669]: I1001 11:34:16.793024 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:34:16 crc kubenswrapper[4669]: I1001 11:34:16.794167 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:34:16 crc kubenswrapper[4669]: I1001 11:34:16.834918 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:34:17 crc kubenswrapper[4669]: I1001 11:34:17.246140 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ttz57" Oct 01 11:34:18 crc kubenswrapper[4669]: I1001 11:34:18.999161 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:34:18 crc kubenswrapper[4669]: I1001 11:34:18.999254 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:34:19 crc kubenswrapper[4669]: I1001 11:34:19.069660 4669 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:34:19 crc kubenswrapper[4669]: I1001 11:34:19.182461 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:34:19 crc kubenswrapper[4669]: I1001 11:34:19.182562 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:34:19 crc kubenswrapper[4669]: I1001 11:34:19.251768 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:34:19 crc kubenswrapper[4669]: I1001 11:34:19.267007 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ntnfv" Oct 01 11:34:19 crc kubenswrapper[4669]: I1001 11:34:19.312155 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f2vsx" Oct 01 11:34:19 crc kubenswrapper[4669]: I1001 11:34:19.507809 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" podUID="d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" containerName="oauth-openshift" containerID="cri-o://faf4d155e6b0f52ff5563396b5e941ee5df12f9b7e30b42c09f7e7e4837afc4c" gracePeriod=15 Oct 01 11:34:21 crc kubenswrapper[4669]: I1001 11:34:21.387625 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:21 crc kubenswrapper[4669]: I1001 11:34:21.388260 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:21 crc kubenswrapper[4669]: I1001 11:34:21.450239 4669 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8hc7m container/oauth-openshift 
namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Oct 01 11:34:21 crc kubenswrapper[4669]: I1001 11:34:21.450380 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" podUID="d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Oct 01 11:34:21 crc kubenswrapper[4669]: I1001 11:34:21.451887 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:22 crc kubenswrapper[4669]: I1001 11:34:22.239294 4669 generic.go:334] "Generic (PLEG): container finished" podID="d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" containerID="faf4d155e6b0f52ff5563396b5e941ee5df12f9b7e30b42c09f7e7e4837afc4c" exitCode=0 Oct 01 11:34:22 crc kubenswrapper[4669]: I1001 11:34:22.239405 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" event={"ID":"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e","Type":"ContainerDied","Data":"faf4d155e6b0f52ff5563396b5e941ee5df12f9b7e30b42c09f7e7e4837afc4c"} Oct 01 11:34:22 crc kubenswrapper[4669]: I1001 11:34:22.309383 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9gtjj" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.187560 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.235769 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6b98686f74-vqfrn"] Oct 01 11:34:23 crc kubenswrapper[4669]: E1001 11:34:23.236557 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" containerName="oauth-openshift" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.236787 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" containerName="oauth-openshift" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.237187 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" containerName="oauth-openshift" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.238164 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.248850 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" event={"ID":"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e","Type":"ContainerDied","Data":"ad520c74ea111e60b9af7480fec4a63baac8559b51ca99cf7d150b9a3eca0626"} Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.248945 4669 scope.go:117] "RemoveContainer" containerID="faf4d155e6b0f52ff5563396b5e941ee5df12f9b7e30b42c09f7e7e4837afc4c" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.248875 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8hc7m" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.275801 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-provider-selection\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.275873 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-service-ca\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.275920 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-session\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.275963 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-policies\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.275992 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-router-certs\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " 
Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.276010 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-trusted-ca-bundle\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.276044 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-dir\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.276064 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-error\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.276098 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-ocp-branding-template\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.276177 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-idp-0-file-data\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.276212 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-serving-cert\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.276237 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-login\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.276260 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzcj7\" (UniqueName: \"kubernetes.io/projected/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-kube-api-access-qzcj7\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.276298 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-cliconfig\") pod \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\" (UID: \"d5ea53db-a0f7-489e-82c5-4ef995cf6d5e\") " Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.276655 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.277865 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.279999 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b98686f74-vqfrn"] Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.280627 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.281126 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.282384 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.288743 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-kube-api-access-qzcj7" (OuterVolumeSpecName: "kube-api-access-qzcj7") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "kube-api-access-qzcj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.288743 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.288978 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.290029 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.299406 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.300057 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.303406 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). 
InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.303907 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.315413 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" (UID: "d5ea53db-a0f7-489e-82c5-4ef995cf6d5e"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378138 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378228 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378270 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-template-login\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378318 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-template-error\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378357 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378437 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378486 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378568 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-audit-policies\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378621 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/e52251a2-ab64-46d8-8acf-044cfafc8920-audit-dir\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378684 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-session\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378736 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378815 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378883 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b98686f74-vqfrn\" 
(UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.378998 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgxql\" (UniqueName: \"kubernetes.io/projected/e52251a2-ab64-46d8-8acf-044cfafc8920-kube-api-access-zgxql\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379176 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379248 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379338 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379370 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzcj7\" (UniqueName: \"kubernetes.io/projected/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-kube-api-access-qzcj7\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379401 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379433 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379454 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379471 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379489 4669 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379510 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379527 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379547 4669 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379565 4669 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.379585 4669 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.480390 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-audit-policies\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481333 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e52251a2-ab64-46d8-8acf-044cfafc8920-audit-dir\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481413 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-session\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " 
pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481470 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481523 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481507 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e52251a2-ab64-46d8-8acf-044cfafc8920-audit-dir\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481571 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481705 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgxql\" 
(UniqueName: \"kubernetes.io/projected/e52251a2-ab64-46d8-8acf-044cfafc8920-kube-api-access-zgxql\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481801 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481846 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481869 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-template-login\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481915 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-template-error\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " 
pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481936 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.481967 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.482010 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.482191 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-audit-policies\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.483170 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.484586 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.484977 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.486325 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-template-login\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.486326 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-session\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 
crc kubenswrapper[4669]: I1001 11:34:23.487123 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.487268 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.487446 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.488181 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.488598 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-user-template-error\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.488967 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e52251a2-ab64-46d8-8acf-044cfafc8920-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.504807 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgxql\" (UniqueName: \"kubernetes.io/projected/e52251a2-ab64-46d8-8acf-044cfafc8920-kube-api-access-zgxql\") pod \"oauth-openshift-6b98686f74-vqfrn\" (UID: \"e52251a2-ab64-46d8-8acf-044cfafc8920\") " pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.560413 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.580681 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8hc7m"] Oct 01 11:34:23 crc kubenswrapper[4669]: I1001 11:34:23.583434 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8hc7m"] Oct 01 11:34:24 crc kubenswrapper[4669]: I1001 11:34:24.000285 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ea53db-a0f7-489e-82c5-4ef995cf6d5e" path="/var/lib/kubelet/pods/d5ea53db-a0f7-489e-82c5-4ef995cf6d5e/volumes" Oct 01 11:34:24 crc kubenswrapper[4669]: I1001 11:34:24.243580 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b98686f74-vqfrn"] Oct 01 11:34:24 crc kubenswrapper[4669]: W1001 11:34:24.254885 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode52251a2_ab64_46d8_8acf_044cfafc8920.slice/crio-a6f01acc2c709a971bc746a6acad025a224160d99d40f733b3926146503bfc39 WatchSource:0}: Error finding container a6f01acc2c709a971bc746a6acad025a224160d99d40f733b3926146503bfc39: Status 404 returned error can't find the container with id a6f01acc2c709a971bc746a6acad025a224160d99d40f733b3926146503bfc39 Oct 01 11:34:25 crc kubenswrapper[4669]: I1001 11:34:25.266473 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" event={"ID":"e52251a2-ab64-46d8-8acf-044cfafc8920","Type":"ContainerStarted","Data":"a6f01acc2c709a971bc746a6acad025a224160d99d40f733b3926146503bfc39"} Oct 01 11:34:31 crc kubenswrapper[4669]: I1001 11:34:31.313384 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" 
event={"ID":"e52251a2-ab64-46d8-8acf-044cfafc8920","Type":"ContainerStarted","Data":"514f6ae1458acf18998469772cfc6804621006f77961214d9072119822275e6e"} Oct 01 11:34:31 crc kubenswrapper[4669]: I1001 11:34:31.863325 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:34:31 crc kubenswrapper[4669]: I1001 11:34:31.863862 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:34:33 crc kubenswrapper[4669]: I1001 11:34:33.328780 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:33 crc kubenswrapper[4669]: I1001 11:34:33.337472 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" Oct 01 11:34:33 crc kubenswrapper[4669]: I1001 11:34:33.375878 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6b98686f74-vqfrn" podStartSLOduration=39.375784212 podStartE2EDuration="39.375784212s" podCreationTimestamp="2025-10-01 11:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:34:33.3687901 +0000 UTC m=+364.468355107" watchObservedRunningTime="2025-10-01 11:34:33.375784212 +0000 UTC m=+364.475349269" Oct 01 11:35:01 crc kubenswrapper[4669]: I1001 11:35:01.863713 4669 patch_prober.go:28] interesting 
pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:35:01 crc kubenswrapper[4669]: I1001 11:35:01.864432 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:35:31 crc kubenswrapper[4669]: I1001 11:35:31.864576 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:35:31 crc kubenswrapper[4669]: I1001 11:35:31.865598 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:35:31 crc kubenswrapper[4669]: I1001 11:35:31.865691 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:35:31 crc kubenswrapper[4669]: I1001 11:35:31.866917 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8027885e355d02196c881ebc15cce3dfddbca8c6fa333e055455ca80503be475"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 01 11:35:31 crc kubenswrapper[4669]: I1001 11:35:31.867023 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://8027885e355d02196c881ebc15cce3dfddbca8c6fa333e055455ca80503be475" gracePeriod=600 Oct 01 11:35:32 crc kubenswrapper[4669]: I1001 11:35:32.775380 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"8027885e355d02196c881ebc15cce3dfddbca8c6fa333e055455ca80503be475"} Oct 01 11:35:32 crc kubenswrapper[4669]: I1001 11:35:32.775481 4669 scope.go:117] "RemoveContainer" containerID="a85b7f046d251d90fe871f0edfc08165ecf34d7d4e335d00fc03035e3ec59054" Oct 01 11:35:32 crc kubenswrapper[4669]: I1001 11:35:32.775629 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="8027885e355d02196c881ebc15cce3dfddbca8c6fa333e055455ca80503be475" exitCode=0 Oct 01 11:35:33 crc kubenswrapper[4669]: I1001 11:35:33.785949 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"749997bec659c722d86e3b88621cbc0e2b2ce7eed205c06ca6b7b63eaf908655"} Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.556725 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8ttzp"] Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.558746 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.623884 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8ttzp"] Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.753346 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0020eb1e-e7ec-4555-995b-7f669f04c529-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.753439 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj5b8\" (UniqueName: \"kubernetes.io/projected/0020eb1e-e7ec-4555-995b-7f669f04c529-kube-api-access-jj5b8\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.753463 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0020eb1e-e7ec-4555-995b-7f669f04c529-registry-certificates\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.753499 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.753660 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0020eb1e-e7ec-4555-995b-7f669f04c529-bound-sa-token\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.753787 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0020eb1e-e7ec-4555-995b-7f669f04c529-registry-tls\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.753863 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0020eb1e-e7ec-4555-995b-7f669f04c529-trusted-ca\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.753902 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0020eb1e-e7ec-4555-995b-7f669f04c529-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.773093 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.855756 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0020eb1e-e7ec-4555-995b-7f669f04c529-registry-tls\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.855867 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0020eb1e-e7ec-4555-995b-7f669f04c529-trusted-ca\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.855927 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0020eb1e-e7ec-4555-995b-7f669f04c529-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.856046 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0020eb1e-e7ec-4555-995b-7f669f04c529-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.856140 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jj5b8\" (UniqueName: \"kubernetes.io/projected/0020eb1e-e7ec-4555-995b-7f669f04c529-kube-api-access-jj5b8\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.856191 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0020eb1e-e7ec-4555-995b-7f669f04c529-registry-certificates\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.856255 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0020eb1e-e7ec-4555-995b-7f669f04c529-bound-sa-token\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.857055 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0020eb1e-e7ec-4555-995b-7f669f04c529-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.858740 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0020eb1e-e7ec-4555-995b-7f669f04c529-trusted-ca\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc 
kubenswrapper[4669]: I1001 11:36:33.860271 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0020eb1e-e7ec-4555-995b-7f669f04c529-registry-certificates\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.869266 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0020eb1e-e7ec-4555-995b-7f669f04c529-registry-tls\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.869912 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0020eb1e-e7ec-4555-995b-7f669f04c529-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.893709 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj5b8\" (UniqueName: \"kubernetes.io/projected/0020eb1e-e7ec-4555-995b-7f669f04c529-kube-api-access-jj5b8\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:33 crc kubenswrapper[4669]: I1001 11:36:33.894177 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0020eb1e-e7ec-4555-995b-7f669f04c529-bound-sa-token\") pod \"image-registry-66df7c8f76-8ttzp\" (UID: \"0020eb1e-e7ec-4555-995b-7f669f04c529\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:34 crc kubenswrapper[4669]: I1001 11:36:34.179134 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:34 crc kubenswrapper[4669]: I1001 11:36:34.413917 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8ttzp"] Oct 01 11:36:35 crc kubenswrapper[4669]: I1001 11:36:35.217203 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" event={"ID":"0020eb1e-e7ec-4555-995b-7f669f04c529","Type":"ContainerStarted","Data":"150c9c685c155fe3c8b95e1fc67de584609ec13b4414acc69f5ff066f6613b29"} Oct 01 11:36:35 crc kubenswrapper[4669]: I1001 11:36:35.217643 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" event={"ID":"0020eb1e-e7ec-4555-995b-7f669f04c529","Type":"ContainerStarted","Data":"d4304d313f7606cda3accb12e8bfd2655aee196ce6b2868def64bab633b378cc"} Oct 01 11:36:35 crc kubenswrapper[4669]: I1001 11:36:35.217681 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:35 crc kubenswrapper[4669]: I1001 11:36:35.253651 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" podStartSLOduration=2.253619175 podStartE2EDuration="2.253619175s" podCreationTimestamp="2025-10-01 11:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:36:35.244894807 +0000 UTC m=+486.344459824" watchObservedRunningTime="2025-10-01 11:36:35.253619175 +0000 UTC m=+486.353184162" Oct 01 11:36:54 crc kubenswrapper[4669]: I1001 11:36:54.189054 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8ttzp" Oct 01 11:36:54 crc kubenswrapper[4669]: I1001 11:36:54.275334 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s7dqt"] Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.336909 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" podUID="e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" containerName="registry" containerID="cri-o://8e199821b5d5ad2d3a29328637d5e13bbdd1b40cfc62c6518fd8a9064895d91f" gracePeriod=30 Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.544045 4669 generic.go:334] "Generic (PLEG): container finished" podID="e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" containerID="8e199821b5d5ad2d3a29328637d5e13bbdd1b40cfc62c6518fd8a9064895d91f" exitCode=0 Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.544308 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" event={"ID":"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e","Type":"ContainerDied","Data":"8e199821b5d5ad2d3a29328637d5e13bbdd1b40cfc62c6518fd8a9064895d91f"} Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.774314 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.863744 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.863813 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrxj8\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-kube-api-access-rrxj8\") pod \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.863852 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-trusted-ca\") pod \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.863884 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-bound-sa-token\") pod \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.863930 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-ca-trust-extracted\") pod \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.863963 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-installation-pull-secrets\") pod \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.863998 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-tls\") pod \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.864040 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-certificates\") pod \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\" (UID: \"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e\") " Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.865961 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.867615 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.871775 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.871902 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.872283 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-kube-api-access-rrxj8" (OuterVolumeSpecName: "kube-api-access-rrxj8") pod "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e"). InnerVolumeSpecName "kube-api-access-rrxj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.873607 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.877236 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.886620 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" (UID: "e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.965630 4669 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.965988 4669 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.966186 4669 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.966316 4669 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.966446 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrxj8\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-kube-api-access-rrxj8\") on node \"crc\" DevicePath \"\"" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.966571 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 11:37:19 crc kubenswrapper[4669]: I1001 11:37:19.966680 4669 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 11:37:20 crc kubenswrapper[4669]: I1001 11:37:20.554845 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" Oct 01 11:37:20 crc kubenswrapper[4669]: I1001 11:37:20.554836 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-s7dqt" event={"ID":"e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e","Type":"ContainerDied","Data":"cfa5476fd0ffdf7abf4986018c10bb42df2803a4caca90cd9387f3dd7f170eb0"} Oct 01 11:37:20 crc kubenswrapper[4669]: I1001 11:37:20.555139 4669 scope.go:117] "RemoveContainer" containerID="8e199821b5d5ad2d3a29328637d5e13bbdd1b40cfc62c6518fd8a9064895d91f" Oct 01 11:37:20 crc kubenswrapper[4669]: I1001 11:37:20.604224 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s7dqt"] Oct 01 11:37:20 crc kubenswrapper[4669]: I1001 11:37:20.610191 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-s7dqt"] Oct 01 11:37:21 crc kubenswrapper[4669]: I1001 11:37:21.657391 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" path="/var/lib/kubelet/pods/e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e/volumes" Oct 01 11:38:01 crc kubenswrapper[4669]: I1001 11:38:01.863364 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:38:01 crc kubenswrapper[4669]: I1001 11:38:01.864067 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:38:31 crc kubenswrapper[4669]: 
I1001 11:38:31.863968 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:38:31 crc kubenswrapper[4669]: I1001 11:38:31.864950 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:39:01 crc kubenswrapper[4669]: I1001 11:39:01.863412 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:39:01 crc kubenswrapper[4669]: I1001 11:39:01.864137 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:39:01 crc kubenswrapper[4669]: I1001 11:39:01.864197 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:39:01 crc kubenswrapper[4669]: I1001 11:39:01.864919 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"749997bec659c722d86e3b88621cbc0e2b2ce7eed205c06ca6b7b63eaf908655"} 
pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 11:39:01 crc kubenswrapper[4669]: I1001 11:39:01.864970 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://749997bec659c722d86e3b88621cbc0e2b2ce7eed205c06ca6b7b63eaf908655" gracePeriod=600 Oct 01 11:39:02 crc kubenswrapper[4669]: I1001 11:39:02.327965 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="749997bec659c722d86e3b88621cbc0e2b2ce7eed205c06ca6b7b63eaf908655" exitCode=0 Oct 01 11:39:02 crc kubenswrapper[4669]: I1001 11:39:02.328020 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"749997bec659c722d86e3b88621cbc0e2b2ce7eed205c06ca6b7b63eaf908655"} Oct 01 11:39:02 crc kubenswrapper[4669]: I1001 11:39:02.328661 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"86579f99b2d7fdefab555c5926d95a0899a74cade0993be4e08705b39fe0421d"} Oct 01 11:39:02 crc kubenswrapper[4669]: I1001 11:39:02.328697 4669 scope.go:117] "RemoveContainer" containerID="8027885e355d02196c881ebc15cce3dfddbca8c6fa333e055455ca80503be475" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.937754 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6kfl9"] Oct 01 11:39:09 crc kubenswrapper[4669]: E1001 11:39:09.938921 4669 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" containerName="registry" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.938944 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" containerName="registry" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.939107 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09c7a9d-5a43-4d6b-ae13-4aef90a55f2e" containerName="registry" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.939780 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-6kfl9" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.940754 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vczt8"] Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.941672 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vczt8" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.944412 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.944407 4669 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qv7p5" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.944481 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.947614 4669 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-kbwkv" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.954933 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-j5v46"] Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.956065 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-j5v46" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.957215 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6kfl9"] Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.959723 4669 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2bbhk" Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.978681 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-j5v46"] Oct 01 11:39:09 crc kubenswrapper[4669]: I1001 11:39:09.991841 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vczt8"] Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.109817 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slzlh\" (UniqueName: \"kubernetes.io/projected/8f212951-fc37-4759-8933-2cee5f94845e-kube-api-access-slzlh\") pod \"cert-manager-cainjector-7f985d654d-6kfl9\" (UID: \"8f212951-fc37-4759-8933-2cee5f94845e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6kfl9" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.109905 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85gfd\" (UniqueName: \"kubernetes.io/projected/2c0929fd-88f7-47d4-9975-54d4d6c606c0-kube-api-access-85gfd\") pod \"cert-manager-5b446d88c5-vczt8\" (UID: \"2c0929fd-88f7-47d4-9975-54d4d6c606c0\") " pod="cert-manager/cert-manager-5b446d88c5-vczt8" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.109940 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cvzl\" (UniqueName: \"kubernetes.io/projected/bb59959e-c15d-466f-8809-66c2ae4c8a0b-kube-api-access-8cvzl\") pod 
\"cert-manager-webhook-5655c58dd6-j5v46\" (UID: \"bb59959e-c15d-466f-8809-66c2ae4c8a0b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-j5v46" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.211468 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85gfd\" (UniqueName: \"kubernetes.io/projected/2c0929fd-88f7-47d4-9975-54d4d6c606c0-kube-api-access-85gfd\") pod \"cert-manager-5b446d88c5-vczt8\" (UID: \"2c0929fd-88f7-47d4-9975-54d4d6c606c0\") " pod="cert-manager/cert-manager-5b446d88c5-vczt8" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.211529 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cvzl\" (UniqueName: \"kubernetes.io/projected/bb59959e-c15d-466f-8809-66c2ae4c8a0b-kube-api-access-8cvzl\") pod \"cert-manager-webhook-5655c58dd6-j5v46\" (UID: \"bb59959e-c15d-466f-8809-66c2ae4c8a0b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-j5v46" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.211650 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slzlh\" (UniqueName: \"kubernetes.io/projected/8f212951-fc37-4759-8933-2cee5f94845e-kube-api-access-slzlh\") pod \"cert-manager-cainjector-7f985d654d-6kfl9\" (UID: \"8f212951-fc37-4759-8933-2cee5f94845e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6kfl9" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.240142 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slzlh\" (UniqueName: \"kubernetes.io/projected/8f212951-fc37-4759-8933-2cee5f94845e-kube-api-access-slzlh\") pod \"cert-manager-cainjector-7f985d654d-6kfl9\" (UID: \"8f212951-fc37-4759-8933-2cee5f94845e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6kfl9" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.241450 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85gfd\" 
(UniqueName: \"kubernetes.io/projected/2c0929fd-88f7-47d4-9975-54d4d6c606c0-kube-api-access-85gfd\") pod \"cert-manager-5b446d88c5-vczt8\" (UID: \"2c0929fd-88f7-47d4-9975-54d4d6c606c0\") " pod="cert-manager/cert-manager-5b446d88c5-vczt8" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.243146 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cvzl\" (UniqueName: \"kubernetes.io/projected/bb59959e-c15d-466f-8809-66c2ae4c8a0b-kube-api-access-8cvzl\") pod \"cert-manager-webhook-5655c58dd6-j5v46\" (UID: \"bb59959e-c15d-466f-8809-66c2ae4c8a0b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-j5v46" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.270214 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-6kfl9" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.287531 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vczt8" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.300964 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-j5v46" Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.606144 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6kfl9"] Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.626761 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-j5v46"] Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.628202 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 11:39:10 crc kubenswrapper[4669]: W1001 11:39:10.635170 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb59959e_c15d_466f_8809_66c2ae4c8a0b.slice/crio-0a69d5b0ce2c69c96d30ce5cfe13772e0c6bcc5f9d614b9506404625532e187f WatchSource:0}: Error finding container 0a69d5b0ce2c69c96d30ce5cfe13772e0c6bcc5f9d614b9506404625532e187f: Status 404 returned error can't find the container with id 0a69d5b0ce2c69c96d30ce5cfe13772e0c6bcc5f9d614b9506404625532e187f Oct 01 11:39:10 crc kubenswrapper[4669]: I1001 11:39:10.665060 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vczt8"] Oct 01 11:39:10 crc kubenswrapper[4669]: W1001 11:39:10.670464 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c0929fd_88f7_47d4_9975_54d4d6c606c0.slice/crio-e8404d5c94297a1e62fdff8c7729327b01a9cb9a2c0c353e5c0822947d2f923d WatchSource:0}: Error finding container e8404d5c94297a1e62fdff8c7729327b01a9cb9a2c0c353e5c0822947d2f923d: Status 404 returned error can't find the container with id e8404d5c94297a1e62fdff8c7729327b01a9cb9a2c0c353e5c0822947d2f923d Oct 01 11:39:11 crc kubenswrapper[4669]: I1001 11:39:11.472500 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-5b446d88c5-vczt8" event={"ID":"2c0929fd-88f7-47d4-9975-54d4d6c606c0","Type":"ContainerStarted","Data":"e8404d5c94297a1e62fdff8c7729327b01a9cb9a2c0c353e5c0822947d2f923d"} Oct 01 11:39:11 crc kubenswrapper[4669]: I1001 11:39:11.474837 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-j5v46" event={"ID":"bb59959e-c15d-466f-8809-66c2ae4c8a0b","Type":"ContainerStarted","Data":"0a69d5b0ce2c69c96d30ce5cfe13772e0c6bcc5f9d614b9506404625532e187f"} Oct 01 11:39:11 crc kubenswrapper[4669]: I1001 11:39:11.476398 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-6kfl9" event={"ID":"8f212951-fc37-4759-8933-2cee5f94845e","Type":"ContainerStarted","Data":"a1bad10d101fc072c6b0bdfcddf104c7d0d40d245aeed748a3c49c93ede2105d"} Oct 01 11:39:15 crc kubenswrapper[4669]: I1001 11:39:15.505595 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-vczt8" event={"ID":"2c0929fd-88f7-47d4-9975-54d4d6c606c0","Type":"ContainerStarted","Data":"df9cbce9166f33144ea5a35b79268066afb3e5436d5382f64eb7524d09711834"} Oct 01 11:39:15 crc kubenswrapper[4669]: I1001 11:39:15.508020 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-j5v46" event={"ID":"bb59959e-c15d-466f-8809-66c2ae4c8a0b","Type":"ContainerStarted","Data":"a9487f437f0c25249c830cc3c63243a296a35db57d681bea3d1383fe43bfbd39"} Oct 01 11:39:15 crc kubenswrapper[4669]: I1001 11:39:15.508216 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-j5v46" Oct 01 11:39:15 crc kubenswrapper[4669]: I1001 11:39:15.510152 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-6kfl9" 
event={"ID":"8f212951-fc37-4759-8933-2cee5f94845e","Type":"ContainerStarted","Data":"8a809fde6aaf2c8ece29318dd4a80a262d42696587f7a1008f4f1838643aed0c"} Oct 01 11:39:15 crc kubenswrapper[4669]: I1001 11:39:15.529318 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-vczt8" podStartSLOduration=2.353613177 podStartE2EDuration="6.52928564s" podCreationTimestamp="2025-10-01 11:39:09 +0000 UTC" firstStartedPulling="2025-10-01 11:39:10.680929195 +0000 UTC m=+641.780494172" lastFinishedPulling="2025-10-01 11:39:14.856601658 +0000 UTC m=+645.956166635" observedRunningTime="2025-10-01 11:39:15.527057705 +0000 UTC m=+646.626622692" watchObservedRunningTime="2025-10-01 11:39:15.52928564 +0000 UTC m=+646.628850657" Oct 01 11:39:15 crc kubenswrapper[4669]: I1001 11:39:15.553889 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-6kfl9" podStartSLOduration=2.447878346 podStartE2EDuration="6.553853944s" podCreationTimestamp="2025-10-01 11:39:09 +0000 UTC" firstStartedPulling="2025-10-01 11:39:10.627867229 +0000 UTC m=+641.727432206" lastFinishedPulling="2025-10-01 11:39:14.733842807 +0000 UTC m=+645.833407804" observedRunningTime="2025-10-01 11:39:15.548164024 +0000 UTC m=+646.647729061" watchObservedRunningTime="2025-10-01 11:39:15.553853944 +0000 UTC m=+646.653418961" Oct 01 11:39:15 crc kubenswrapper[4669]: I1001 11:39:15.571561 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-j5v46" podStartSLOduration=2.4712738229999998 podStartE2EDuration="6.57153777s" podCreationTimestamp="2025-10-01 11:39:09 +0000 UTC" firstStartedPulling="2025-10-01 11:39:10.63766413 +0000 UTC m=+641.737229107" lastFinishedPulling="2025-10-01 11:39:14.737928077 +0000 UTC m=+645.837493054" observedRunningTime="2025-10-01 11:39:15.571287514 +0000 UTC m=+646.670852531" 
watchObservedRunningTime="2025-10-01 11:39:15.57153777 +0000 UTC m=+646.671102787" Oct 01 11:39:20 crc kubenswrapper[4669]: I1001 11:39:20.305903 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-j5v46" Oct 01 11:39:20 crc kubenswrapper[4669]: I1001 11:39:20.650246 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8kl5"] Oct 01 11:39:20 crc kubenswrapper[4669]: I1001 11:39:20.650659 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovn-controller" containerID="cri-o://933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c" gracePeriod=30 Oct 01 11:39:20 crc kubenswrapper[4669]: I1001 11:39:20.651163 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="sbdb" containerID="cri-o://8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c" gracePeriod=30 Oct 01 11:39:20 crc kubenswrapper[4669]: I1001 11:39:20.651227 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="nbdb" containerID="cri-o://6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe" gracePeriod=30 Oct 01 11:39:20 crc kubenswrapper[4669]: I1001 11:39:20.651281 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="northd" containerID="cri-o://a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2" gracePeriod=30 Oct 01 11:39:20 crc kubenswrapper[4669]: I1001 11:39:20.651337 4669 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd" gracePeriod=30 Oct 01 11:39:20 crc kubenswrapper[4669]: I1001 11:39:20.651411 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="kube-rbac-proxy-node" containerID="cri-o://a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380" gracePeriod=30 Oct 01 11:39:20 crc kubenswrapper[4669]: I1001 11:39:20.651462 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovn-acl-logging" containerID="cri-o://22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257" gracePeriod=30 Oct 01 11:39:20 crc kubenswrapper[4669]: I1001 11:39:20.694566 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" containerID="cri-o://314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4" gracePeriod=30 Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.013391 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/3.log" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.015951 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovn-acl-logging/0.log" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.016571 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovn-controller/0.log" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.017030 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076442 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-86wfm"] Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.076717 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="northd" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076734 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="northd" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.076747 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076759 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.076769 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="kubecfg-setup" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076778 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="kubecfg-setup" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.076790 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="sbdb" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076798 4669 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="sbdb" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.076809 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovn-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076817 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovn-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.076826 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076836 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.076846 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076856 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.076865 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovn-acl-logging" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076874 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovn-acl-logging" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.076887 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="kube-rbac-proxy-node" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076896 4669 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="kube-rbac-proxy-node" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.076913 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="nbdb" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076921 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="nbdb" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.076938 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.076947 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077068 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovn-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077108 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077124 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="northd" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077135 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="nbdb" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077151 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="sbdb" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077160 4669 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077170 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077184 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovn-acl-logging" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077200 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077215 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="kube-rbac-proxy-node" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.077374 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077387 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077548 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077571 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.077761 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.077777 
4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5784d2-a874-4956-9d09-e923ac324925" containerName="ovnkube-controller" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.080542 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.183402 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-etc-openvswitch\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184292 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-netd\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184371 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-var-lib-openvswitch\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184419 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-node-log\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184462 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-ovn-kubernetes\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184509 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-netns\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184544 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-bin\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.183660 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184595 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-log-socket\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184641 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-log-socket" (OuterVolumeSpecName: "log-socket") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184660 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184705 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-node-log" (OuterVolumeSpecName: "node-log") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184747 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-var-lib-cni-networks-ovn-kubernetes\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184846 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-slash\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184897 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6c5784d2-a874-4956-9d09-e923ac324925-ovn-node-metrics-cert\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184742 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184779 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). 
InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184796 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184808 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184835 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.185057 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184960 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-slash" (OuterVolumeSpecName: "host-slash") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.184987 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-ovn\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.185204 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-script-lib\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.185282 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-env-overrides\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.185330 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-openvswitch\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.185370 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-45sfm\" (UniqueName: \"kubernetes.io/projected/6c5784d2-a874-4956-9d09-e923ac324925-kube-api-access-45sfm\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.185430 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-systemd\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.185455 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.185474 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-kubelet\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.185533 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-systemd-units\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.185575 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-config\") pod \"6c5784d2-a874-4956-9d09-e923ac324925\" (UID: \"6c5784d2-a874-4956-9d09-e923ac324925\") " Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186015 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b2932eb5-8aba-4c03-902a-ac251d6dae68-ovnkube-config\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186046 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186063 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnv9h\" (UniqueName: \"kubernetes.io/projected/b2932eb5-8aba-4c03-902a-ac251d6dae68-kube-api-access-fnv9h\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186140 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-run-netns\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186176 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-run-ovn\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186205 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186237 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b2932eb5-8aba-4c03-902a-ac251d6dae68-ovn-node-metrics-cert\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186309 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186445 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-log-socket\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186516 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-slash\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186584 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-cni-netd\") pod 
\"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186623 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-run-ovn-kubernetes\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186653 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-cni-bin\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186684 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-node-log\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186754 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-systemd-units\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186788 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-run-systemd\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186821 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-etc-openvswitch\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186846 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-kubelet\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186885 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-var-lib-openvswitch\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186953 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-run-openvswitch\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.186983 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2932eb5-8aba-4c03-902a-ac251d6dae68-env-overrides\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187056 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187160 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b2932eb5-8aba-4c03-902a-ac251d6dae68-ovnkube-script-lib\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187154 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187182 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187316 4669 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187376 4669 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187395 4669 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187408 4669 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-node-log\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187423 4669 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187440 4669 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187454 4669 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 
11:39:21.187467 4669 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-log-socket\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187485 4669 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187499 4669 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-slash\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187515 4669 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187528 4669 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187542 4669 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187557 4669 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.187570 4669 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.193535 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5784d2-a874-4956-9d09-e923ac324925-kube-api-access-45sfm" (OuterVolumeSpecName: "kube-api-access-45sfm") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "kube-api-access-45sfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.194250 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5784d2-a874-4956-9d09-e923ac324925-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.202476 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "6c5784d2-a874-4956-9d09-e923ac324925" (UID: "6c5784d2-a874-4956-9d09-e923ac324925"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.288936 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-var-lib-openvswitch\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.288999 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-run-openvswitch\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289019 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2932eb5-8aba-4c03-902a-ac251d6dae68-env-overrides\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289048 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289099 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b2932eb5-8aba-4c03-902a-ac251d6dae68-ovnkube-script-lib\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289132 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b2932eb5-8aba-4c03-902a-ac251d6dae68-ovnkube-config\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289148 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnv9h\" (UniqueName: \"kubernetes.io/projected/b2932eb5-8aba-4c03-902a-ac251d6dae68-kube-api-access-fnv9h\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289166 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-run-netns\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289186 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-run-ovn\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289209 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b2932eb5-8aba-4c03-902a-ac251d6dae68-ovn-node-metrics-cert\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 
crc kubenswrapper[4669]: I1001 11:39:21.289196 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-run-openvswitch\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289274 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-log-socket\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289228 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-log-socket\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289377 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-slash\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289433 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-cni-netd\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289377 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-run-ovn\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289485 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-run-ovn-kubernetes\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289502 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-cni-netd\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289529 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-cni-bin\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289465 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-slash\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289562 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-run-ovn-kubernetes\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289603 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-node-log\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289615 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289651 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-node-log\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289689 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-cni-bin\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289714 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-systemd-units\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289756 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-systemd-units\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289775 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-run-systemd\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289853 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-run-systemd\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289928 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-etc-openvswitch\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289982 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-kubelet\") pod \"ovnkube-node-86wfm\" 
(UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.289990 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-var-lib-openvswitch\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.290044 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-etc-openvswitch\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.290010 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-run-netns\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.290299 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2932eb5-8aba-4c03-902a-ac251d6dae68-host-kubelet\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.290392 4669 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.290420 4669 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-45sfm\" (UniqueName: \"kubernetes.io/projected/6c5784d2-a874-4956-9d09-e923ac324925-kube-api-access-45sfm\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.290440 4669 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6c5784d2-a874-4956-9d09-e923ac324925-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.290457 4669 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6c5784d2-a874-4956-9d09-e923ac324925-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.290478 4669 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6c5784d2-a874-4956-9d09-e923ac324925-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.291950 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2932eb5-8aba-4c03-902a-ac251d6dae68-env-overrides\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.292071 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b2932eb5-8aba-4c03-902a-ac251d6dae68-ovnkube-config\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.292181 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b2932eb5-8aba-4c03-902a-ac251d6dae68-ovnkube-script-lib\") pod 
\"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.293624 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b2932eb5-8aba-4c03-902a-ac251d6dae68-ovn-node-metrics-cert\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.312368 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnv9h\" (UniqueName: \"kubernetes.io/projected/b2932eb5-8aba-4c03-902a-ac251d6dae68-kube-api-access-fnv9h\") pod \"ovnkube-node-86wfm\" (UID: \"b2932eb5-8aba-4c03-902a-ac251d6dae68\") " pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.411275 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.556884 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovnkube-controller/3.log" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.561778 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovn-acl-logging/0.log" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563145 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8kl5_6c5784d2-a874-4956-9d09-e923ac324925/ovn-controller/0.log" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563720 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4" exitCode=0 Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563781 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c" exitCode=0 Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563792 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe" exitCode=0 Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563801 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2" exitCode=0 Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563809 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" 
containerID="acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd" exitCode=0 Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563816 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380" exitCode=0 Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563823 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257" exitCode=143 Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563830 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c5784d2-a874-4956-9d09-e923ac324925" containerID="933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c" exitCode=143 Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563888 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563922 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563938 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563949 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" 
event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563961 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563972 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.563987 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564001 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564010 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564016 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564022 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564027 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564033 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564039 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564044 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564051 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564060 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564066 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564090 4669 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564096 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564101 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564107 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564112 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564117 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564123 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564129 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564137 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564146 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564153 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564158 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564164 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564171 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564176 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564182 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"} Oct 01 
11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564187 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564194 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564200 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564210 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" event={"ID":"6c5784d2-a874-4956-9d09-e923ac324925","Type":"ContainerDied","Data":"cbdf041ce88efe1f02c8d5fd5d3255c519a7efa5d979db042de2c0fcc6791d0e"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564219 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564226 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564232 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564238 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564246 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564253 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564259 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564264 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564270 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564276 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564292 4669 scope.go:117] "RemoveContainer" containerID="314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.564491 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8kl5" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.566855 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" event={"ID":"b2932eb5-8aba-4c03-902a-ac251d6dae68","Type":"ContainerStarted","Data":"5b6c8d9f8be5390be5a230430afcf6bda3579d6eda0d331ed75f4eba501ce6b6"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.571173 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9kgdm_238b8e33-ca8b-419a-b038-329ab97a3843/kube-multus/2.log" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.571751 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9kgdm_238b8e33-ca8b-419a-b038-329ab97a3843/kube-multus/1.log" Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.571844 4669 generic.go:334] "Generic (PLEG): container finished" podID="238b8e33-ca8b-419a-b038-329ab97a3843" containerID="2b92a3e428a83d8c9dfc08f32f569a5b6ad6841717ca649606e0ea74c98b3996" exitCode=2 Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.571951 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9kgdm" event={"ID":"238b8e33-ca8b-419a-b038-329ab97a3843","Type":"ContainerDied","Data":"2b92a3e428a83d8c9dfc08f32f569a5b6ad6841717ca649606e0ea74c98b3996"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.572022 4669 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795"} Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.572769 4669 scope.go:117] "RemoveContainer" containerID="2b92a3e428a83d8c9dfc08f32f569a5b6ad6841717ca649606e0ea74c98b3996" Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.572993 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9kgdm_openshift-multus(238b8e33-ca8b-419a-b038-329ab97a3843)\"" pod="openshift-multus/multus-9kgdm" podUID="238b8e33-ca8b-419a-b038-329ab97a3843"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.581722 4669 scope.go:117] "RemoveContainer" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.647891 4669 scope.go:117] "RemoveContainer" containerID="8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.663595 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8kl5"]
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.664417 4669 scope.go:117] "RemoveContainer" containerID="6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.669398 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8kl5"]
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.681118 4669 scope.go:117] "RemoveContainer" containerID="a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.695177 4669 scope.go:117] "RemoveContainer" containerID="acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.709449 4669 scope.go:117] "RemoveContainer" containerID="a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.722003 4669 scope.go:117] "RemoveContainer" containerID="22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.737413 4669 scope.go:117] "RemoveContainer" containerID="933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.753544 4669 scope.go:117] "RemoveContainer" containerID="981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.775577 4669 scope.go:117] "RemoveContainer" containerID="314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"
Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.776304 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4\": container with ID starting with 314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4 not found: ID does not exist" containerID="314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.776387 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"} err="failed to get container status \"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4\": rpc error: code = NotFound desc = could not find container \"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4\": container with ID starting with 314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.776434 4669 scope.go:117] "RemoveContainer" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"
Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.776831 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\": container with ID starting with a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0 not found: ID does not exist" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.776867 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"} err="failed to get container status \"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\": rpc error: code = NotFound desc = could not find container \"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\": container with ID starting with a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.776884 4669 scope.go:117] "RemoveContainer" containerID="8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"
Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.777164 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\": container with ID starting with 8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c not found: ID does not exist" containerID="8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.777194 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"} err="failed to get container status \"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\": rpc error: code = NotFound desc = could not find container \"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\": container with ID starting with 8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.777222 4669 scope.go:117] "RemoveContainer" containerID="6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"
Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.777519 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\": container with ID starting with 6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe not found: ID does not exist" containerID="6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.777550 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"} err="failed to get container status \"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\": rpc error: code = NotFound desc = could not find container \"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\": container with ID starting with 6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.777570 4669 scope.go:117] "RemoveContainer" containerID="a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"
Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.777848 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\": container with ID starting with a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2 not found: ID does not exist" containerID="a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.777876 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"} err="failed to get container status \"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\": rpc error: code = NotFound desc = could not find container \"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\": container with ID starting with a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.777926 4669 scope.go:117] "RemoveContainer" containerID="acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"
Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.778187 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\": container with ID starting with acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd not found: ID does not exist" containerID="acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.778207 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"} err="failed to get container status \"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\": rpc error: code = NotFound desc = could not find container \"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\": container with ID starting with acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.778225 4669 scope.go:117] "RemoveContainer" containerID="a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"
Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.778449 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\": container with ID starting with a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380 not found: ID does not exist" containerID="a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.778479 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"} err="failed to get container status \"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\": rpc error: code = NotFound desc = could not find container \"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\": container with ID starting with a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.778495 4669 scope.go:117] "RemoveContainer" containerID="22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"
Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.778725 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\": container with ID starting with 22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257 not found: ID does not exist" containerID="22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.778745 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"} err="failed to get container status \"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\": rpc error: code = NotFound desc = could not find container \"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\": container with ID starting with 22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.778760 4669 scope.go:117] "RemoveContainer" containerID="933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"
Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.778947 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\": container with ID starting with 933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c not found: ID does not exist" containerID="933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.778967 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"} err="failed to get container status \"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\": rpc error: code = NotFound desc = could not find container \"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\": container with ID starting with 933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.778981 4669 scope.go:117] "RemoveContainer" containerID="981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"
Oct 01 11:39:21 crc kubenswrapper[4669]: E1001 11:39:21.779252 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\": container with ID starting with 981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0 not found: ID does not exist" containerID="981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.779282 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"} err="failed to get container status \"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\": rpc error: code = NotFound desc = could not find container \"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\": container with ID starting with 981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.779302 4669 scope.go:117] "RemoveContainer" containerID="314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.779762 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"} err="failed to get container status \"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4\": rpc error: code = NotFound desc = could not find container \"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4\": container with ID starting with 314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.779840 4669 scope.go:117] "RemoveContainer" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.780299 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"} err="failed to get container status \"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\": rpc error: code = NotFound desc = could not find container \"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\": container with ID starting with a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.780329 4669 scope.go:117] "RemoveContainer" containerID="8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.780815 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"} err="failed to get container status \"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\": rpc error: code = NotFound desc = could not find container \"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\": container with ID starting with 8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.780860 4669 scope.go:117] "RemoveContainer" containerID="6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.781532 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"} err="failed to get container status \"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\": rpc error: code = NotFound desc = could not find container \"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\": container with ID starting with 6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.781564 4669 scope.go:117] "RemoveContainer" containerID="a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.782794 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"} err="failed to get container status \"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\": rpc error: code = NotFound desc = could not find container \"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\": container with ID starting with a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.782895 4669 scope.go:117] "RemoveContainer" containerID="acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.783577 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"} err="failed to get container status \"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\": rpc error: code = NotFound desc = could not find container \"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\": container with ID starting with acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.783628 4669 scope.go:117] "RemoveContainer" containerID="a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.784015 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"} err="failed to get container status \"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\": rpc error: code = NotFound desc = could not find container \"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\": container with ID starting with a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.784050 4669 scope.go:117] "RemoveContainer" containerID="22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.784357 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"} err="failed to get container status \"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\": rpc error: code = NotFound desc = could not find container \"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\": container with ID starting with 22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.784377 4669 scope.go:117] "RemoveContainer" containerID="933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.784695 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"} err="failed to get container status \"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\": rpc error: code = NotFound desc = could not find container \"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\": container with ID starting with 933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.784722 4669 scope.go:117] "RemoveContainer" containerID="981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.785005 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"} err="failed to get container status \"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\": rpc error: code = NotFound desc = could not find container \"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\": container with ID starting with 981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.785025 4669 scope.go:117] "RemoveContainer" containerID="314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.785318 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"} err="failed to get container status \"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4\": rpc error: code = NotFound desc = could not find container \"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4\": container with ID starting with 314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.785343 4669 scope.go:117] "RemoveContainer" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.785822 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"} err="failed to get container status \"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\": rpc error: code = NotFound desc = could not find container \"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\": container with ID starting with a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.785847 4669 scope.go:117] "RemoveContainer" containerID="8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.786509 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"} err="failed to get container status \"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\": rpc error: code = NotFound desc = could not find container \"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\": container with ID starting with 8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.786557 4669 scope.go:117] "RemoveContainer" containerID="6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.787054 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"} err="failed to get container status \"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\": rpc error: code = NotFound desc = could not find container \"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\": container with ID starting with 6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.787111 4669 scope.go:117] "RemoveContainer" containerID="a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.787538 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"} err="failed to get container status \"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\": rpc error: code = NotFound desc = could not find container \"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\": container with ID starting with a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.787567 4669 scope.go:117] "RemoveContainer" containerID="acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.787972 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"} err="failed to get container status \"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\": rpc error: code = NotFound desc = could not find container \"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\": container with ID starting with acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.788019 4669 scope.go:117] "RemoveContainer" containerID="a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.788452 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"} err="failed to get container status \"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\": rpc error: code = NotFound desc = could not find container \"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\": container with ID starting with a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.788482 4669 scope.go:117] "RemoveContainer" containerID="22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.789378 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"} err="failed to get container status \"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\": rpc error: code = NotFound desc = could not find container \"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\": container with ID starting with 22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.789440 4669 scope.go:117] "RemoveContainer" containerID="933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.789815 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"} err="failed to get container status \"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\": rpc error: code = NotFound desc = could not find container \"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\": container with ID starting with 933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.789845 4669 scope.go:117] "RemoveContainer" containerID="981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.790311 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"} err="failed to get container status \"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\": rpc error: code = NotFound desc = could not find container \"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\": container with ID starting with 981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.790358 4669 scope.go:117] "RemoveContainer" containerID="314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.790847 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"} err="failed to get container status \"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4\": rpc error: code = NotFound desc = could not find container \"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4\": container with ID starting with 314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.790873 4669 scope.go:117] "RemoveContainer" containerID="a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.791249 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0"} err="failed to get container status \"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\": rpc error: code = NotFound desc = could not find container \"a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0\": container with ID starting with a48f1060b815aa4e50174786077a72df467ff0832fa2dfca1ece9bf65155afc0 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.791297 4669 scope.go:117] "RemoveContainer" containerID="8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.793939 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c"} err="failed to get container status \"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\": rpc error: code = NotFound desc = could not find container \"8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c\": container with ID starting with 8aac59993558a5e0cc4a88500e485cee02b553be11d46e2958ffa786ed42218c not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.793976 4669 scope.go:117] "RemoveContainer" containerID="6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.794404 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe"} err="failed to get container status \"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\": rpc error: code = NotFound desc = could not find container \"6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe\": container with ID starting with 6744ca2a342eb147fadace3f548b404a2f5a31410cc692bd14e7cbcf219bcabe not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.794466 4669 scope.go:117] "RemoveContainer" containerID="a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.795286 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2"} err="failed to get container status \"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\": rpc error: code = NotFound desc = could not find container \"a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2\": container with ID starting with a88952ba59dfd1d1f6ed8da771d63b0813e13282b465d571f93f262f9d4c43e2 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.795315 4669 scope.go:117] "RemoveContainer" containerID="acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.795736 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd"} err="failed to get container status \"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\": rpc error: code = NotFound desc = could not find container \"acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd\": container with ID starting with acd6428a9168f82c8f35eac5011da02cf1bced32d1d6e7d34fb1396a8287d7cd not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.795788 4669 scope.go:117] "RemoveContainer" containerID="a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.796170 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380"} err="failed to get container status \"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\": rpc error: code = NotFound desc = could not find container \"a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380\": container with ID starting with a8a1cd0473a98d879a229787fbe6fbb71d862659e96c0d93a9e6c6c736c86380 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.796223 4669 scope.go:117] "RemoveContainer" containerID="22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.796516 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257"} err="failed to get container status \"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\": rpc error: code = NotFound desc = could not find container \"22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257\": container with ID starting with 22907f5e440fd4c01016b76568d8ae6e401d196b65fe1103f3e13ce778374257 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.796559 4669 scope.go:117] "RemoveContainer" containerID="933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.796844 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c"} err="failed to get container status \"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\": rpc error: code = NotFound desc = could not find container \"933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c\": container with ID starting with 933d409b3a8e4031301b54f891fbf8c8aca9f1ad6258fbd27d5b0c25a32af54c not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.796873 4669 scope.go:117] "RemoveContainer" containerID="981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.797226 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0"} err="failed to get container status \"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\": rpc error: code = NotFound desc = could not find container \"981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0\": container with ID starting with 981ce3582c09663f7219ca8bebb73bfb3e30c937e346c2dca9d6b113157852d0 not found: ID does not exist"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.797291 4669 scope.go:117] "RemoveContainer" containerID="314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"
Oct 01 11:39:21 crc kubenswrapper[4669]: I1001 11:39:21.797640 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4"} err="failed to get container status \"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4\": rpc error: code = NotFound desc = could not find container \"314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4\": container with ID starting with 314c5037959f994660c56e92fcf5e2b03e84d2877a275f8516af4e02bfc2bfe4 not found: ID does not exist"
Oct 01 11:39:22 crc kubenswrapper[4669]: I1001 11:39:22.581795 4669 generic.go:334] "Generic (PLEG): container finished" podID="b2932eb5-8aba-4c03-902a-ac251d6dae68" containerID="6c3afca070c1695e69c5ea902d95e5c4baac7e73585bb4456bf1e8b8ae0429ea" exitCode=0
Oct 01 11:39:22 crc kubenswrapper[4669]: I1001 11:39:22.581995 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" event={"ID":"b2932eb5-8aba-4c03-902a-ac251d6dae68","Type":"ContainerDied","Data":"6c3afca070c1695e69c5ea902d95e5c4baac7e73585bb4456bf1e8b8ae0429ea"}
Oct 01 11:39:23 crc kubenswrapper[4669]: I1001 11:39:23.597187 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" event={"ID":"b2932eb5-8aba-4c03-902a-ac251d6dae68","Type":"ContainerStarted","Data":"abfec36cf4f9f894a84441168b543a88854ff4fae70e32a191cb8585a81c985f"}
Oct 01 11:39:23 crc kubenswrapper[4669]: I1001 11:39:23.598213 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" event={"ID":"b2932eb5-8aba-4c03-902a-ac251d6dae68","Type":"ContainerStarted","Data":"f01f21f42399afd9f5200ef4ad963f4c2ed0bf2c00134bc829dbe36ab311d8f8"}
Oct 01
11:39:23 crc kubenswrapper[4669]: I1001 11:39:23.598244 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" event={"ID":"b2932eb5-8aba-4c03-902a-ac251d6dae68","Type":"ContainerStarted","Data":"fb12f9b38a936627c694ad5382d70a3e3faf41bc8fff436a188588ed8ab62bbc"} Oct 01 11:39:23 crc kubenswrapper[4669]: I1001 11:39:23.598259 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" event={"ID":"b2932eb5-8aba-4c03-902a-ac251d6dae68","Type":"ContainerStarted","Data":"a04a5d18947d316713cdcbffe6c1d4097affcabc7afb6d97cace9d1f48992553"} Oct 01 11:39:23 crc kubenswrapper[4669]: I1001 11:39:23.598271 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" event={"ID":"b2932eb5-8aba-4c03-902a-ac251d6dae68","Type":"ContainerStarted","Data":"8555a3ad4d7304c13b4ad506618046ca996031ddc040e647e6fc635c93b529cb"} Oct 01 11:39:23 crc kubenswrapper[4669]: I1001 11:39:23.598285 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" event={"ID":"b2932eb5-8aba-4c03-902a-ac251d6dae68","Type":"ContainerStarted","Data":"e6e7319b2cddc72df6d888b3367957c21310f28645a27956c4f5269987fe7896"} Oct 01 11:39:23 crc kubenswrapper[4669]: I1001 11:39:23.653306 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5784d2-a874-4956-9d09-e923ac324925" path="/var/lib/kubelet/pods/6c5784d2-a874-4956-9d09-e923ac324925/volumes" Oct 01 11:39:26 crc kubenswrapper[4669]: I1001 11:39:26.627696 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" event={"ID":"b2932eb5-8aba-4c03-902a-ac251d6dae68","Type":"ContainerStarted","Data":"a0f33e5c9cef85df556fac87ed6a6fe44b8b164b3e7a615dea1025a360f41c4c"} Oct 01 11:39:28 crc kubenswrapper[4669]: I1001 11:39:28.646838 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" event={"ID":"b2932eb5-8aba-4c03-902a-ac251d6dae68","Type":"ContainerStarted","Data":"6b556a3fa15c19603bb7dcf9e93ed749894ab7a5287bd077cbad1378d1ff9b3f"} Oct 01 11:39:28 crc kubenswrapper[4669]: I1001 11:39:28.682730 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" podStartSLOduration=7.682710107 podStartE2EDuration="7.682710107s" podCreationTimestamp="2025-10-01 11:39:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:39:28.67998778 +0000 UTC m=+659.779552777" watchObservedRunningTime="2025-10-01 11:39:28.682710107 +0000 UTC m=+659.782275094" Oct 01 11:39:29 crc kubenswrapper[4669]: I1001 11:39:29.660956 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:29 crc kubenswrapper[4669]: I1001 11:39:29.661022 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:29 crc kubenswrapper[4669]: I1001 11:39:29.661043 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:29 crc kubenswrapper[4669]: I1001 11:39:29.703625 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:29 crc kubenswrapper[4669]: I1001 11:39:29.704230 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:39:30 crc kubenswrapper[4669]: I1001 11:39:30.201989 4669 scope.go:117] "RemoveContainer" containerID="7d7586480d3f426660b49079e4aff1fc141c6daa68f43e313f58480faeb87795" Oct 01 11:39:30 crc kubenswrapper[4669]: I1001 11:39:30.665025 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-9kgdm_238b8e33-ca8b-419a-b038-329ab97a3843/kube-multus/2.log" Oct 01 11:39:32 crc kubenswrapper[4669]: I1001 11:39:32.644967 4669 scope.go:117] "RemoveContainer" containerID="2b92a3e428a83d8c9dfc08f32f569a5b6ad6841717ca649606e0ea74c98b3996" Oct 01 11:39:32 crc kubenswrapper[4669]: E1001 11:39:32.645387 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9kgdm_openshift-multus(238b8e33-ca8b-419a-b038-329ab97a3843)\"" pod="openshift-multus/multus-9kgdm" podUID="238b8e33-ca8b-419a-b038-329ab97a3843" Oct 01 11:39:46 crc kubenswrapper[4669]: I1001 11:39:46.645461 4669 scope.go:117] "RemoveContainer" containerID="2b92a3e428a83d8c9dfc08f32f569a5b6ad6841717ca649606e0ea74c98b3996" Oct 01 11:39:47 crc kubenswrapper[4669]: I1001 11:39:47.800394 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9kgdm_238b8e33-ca8b-419a-b038-329ab97a3843/kube-multus/2.log" Oct 01 11:39:47 crc kubenswrapper[4669]: I1001 11:39:47.801324 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9kgdm" event={"ID":"238b8e33-ca8b-419a-b038-329ab97a3843","Type":"ContainerStarted","Data":"7dcc91591bf48526b226fa3787458a4f2919798cb96c2eef76c7a358505d0c8f"} Oct 01 11:39:51 crc kubenswrapper[4669]: I1001 11:39:51.450144 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-86wfm" Oct 01 11:40:02 crc kubenswrapper[4669]: I1001 11:40:02.823647 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9"] Oct 01 11:40:02 crc kubenswrapper[4669]: I1001 11:40:02.825698 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:02 crc kubenswrapper[4669]: I1001 11:40:02.829413 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 11:40:02 crc kubenswrapper[4669]: I1001 11:40:02.842514 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9"] Oct 01 11:40:02 crc kubenswrapper[4669]: I1001 11:40:02.924576 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:02 crc kubenswrapper[4669]: I1001 11:40:02.924681 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5z69\" (UniqueName: \"kubernetes.io/projected/57b1aea1-6b22-4512-b88f-bafc19415c87-kube-api-access-c5z69\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:02 crc kubenswrapper[4669]: I1001 11:40:02.924766 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:03 crc kubenswrapper[4669]: 
I1001 11:40:03.025893 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5z69\" (UniqueName: \"kubernetes.io/projected/57b1aea1-6b22-4512-b88f-bafc19415c87-kube-api-access-c5z69\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:03 crc kubenswrapper[4669]: I1001 11:40:03.025970 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:03 crc kubenswrapper[4669]: I1001 11:40:03.026045 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:03 crc kubenswrapper[4669]: I1001 11:40:03.026674 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:03 crc kubenswrapper[4669]: I1001 11:40:03.026934 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:03 crc kubenswrapper[4669]: I1001 11:40:03.055516 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5z69\" (UniqueName: \"kubernetes.io/projected/57b1aea1-6b22-4512-b88f-bafc19415c87-kube-api-access-c5z69\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:03 crc kubenswrapper[4669]: I1001 11:40:03.144811 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:03 crc kubenswrapper[4669]: I1001 11:40:03.690762 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9"] Oct 01 11:40:03 crc kubenswrapper[4669]: W1001 11:40:03.704173 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b1aea1_6b22_4512_b88f_bafc19415c87.slice/crio-99144e8c4cb0d10ca0ce40fb3bbad8f46d1313e3e764e82cc93c6d4b311bddf4 WatchSource:0}: Error finding container 99144e8c4cb0d10ca0ce40fb3bbad8f46d1313e3e764e82cc93c6d4b311bddf4: Status 404 returned error can't find the container with id 99144e8c4cb0d10ca0ce40fb3bbad8f46d1313e3e764e82cc93c6d4b311bddf4 Oct 01 11:40:03 crc kubenswrapper[4669]: I1001 11:40:03.916796 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" 
event={"ID":"57b1aea1-6b22-4512-b88f-bafc19415c87","Type":"ContainerStarted","Data":"6d32365c04f49ba4b31fa32d4487f929cdeacfacf93c17fc7354094b7de81806"} Oct 01 11:40:03 crc kubenswrapper[4669]: I1001 11:40:03.916878 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" event={"ID":"57b1aea1-6b22-4512-b88f-bafc19415c87","Type":"ContainerStarted","Data":"99144e8c4cb0d10ca0ce40fb3bbad8f46d1313e3e764e82cc93c6d4b311bddf4"} Oct 01 11:40:04 crc kubenswrapper[4669]: I1001 11:40:04.926808 4669 generic.go:334] "Generic (PLEG): container finished" podID="57b1aea1-6b22-4512-b88f-bafc19415c87" containerID="6d32365c04f49ba4b31fa32d4487f929cdeacfacf93c17fc7354094b7de81806" exitCode=0 Oct 01 11:40:04 crc kubenswrapper[4669]: I1001 11:40:04.926889 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" event={"ID":"57b1aea1-6b22-4512-b88f-bafc19415c87","Type":"ContainerDied","Data":"6d32365c04f49ba4b31fa32d4487f929cdeacfacf93c17fc7354094b7de81806"} Oct 01 11:40:06 crc kubenswrapper[4669]: I1001 11:40:06.944965 4669 generic.go:334] "Generic (PLEG): container finished" podID="57b1aea1-6b22-4512-b88f-bafc19415c87" containerID="42d95caf3e2c340541dc069c089bb9a00ba54d714872a54aa1ebcfc8d0bc66f0" exitCode=0 Oct 01 11:40:06 crc kubenswrapper[4669]: I1001 11:40:06.945384 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" event={"ID":"57b1aea1-6b22-4512-b88f-bafc19415c87","Type":"ContainerDied","Data":"42d95caf3e2c340541dc069c089bb9a00ba54d714872a54aa1ebcfc8d0bc66f0"} Oct 01 11:40:07 crc kubenswrapper[4669]: I1001 11:40:07.956127 4669 generic.go:334] "Generic (PLEG): container finished" podID="57b1aea1-6b22-4512-b88f-bafc19415c87" containerID="1212aead25c801375a8dfb0c8f2d635dc0ad03f7ba0ba3ee4191070b1b47bc52" exitCode=0 
Oct 01 11:40:07 crc kubenswrapper[4669]: I1001 11:40:07.956195 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" event={"ID":"57b1aea1-6b22-4512-b88f-bafc19415c87","Type":"ContainerDied","Data":"1212aead25c801375a8dfb0c8f2d635dc0ad03f7ba0ba3ee4191070b1b47bc52"} Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.233353 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.330853 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-bundle\") pod \"57b1aea1-6b22-4512-b88f-bafc19415c87\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.331335 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5z69\" (UniqueName: \"kubernetes.io/projected/57b1aea1-6b22-4512-b88f-bafc19415c87-kube-api-access-c5z69\") pod \"57b1aea1-6b22-4512-b88f-bafc19415c87\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.331379 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-util\") pod \"57b1aea1-6b22-4512-b88f-bafc19415c87\" (UID: \"57b1aea1-6b22-4512-b88f-bafc19415c87\") " Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.332818 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-bundle" (OuterVolumeSpecName: "bundle") pod "57b1aea1-6b22-4512-b88f-bafc19415c87" (UID: "57b1aea1-6b22-4512-b88f-bafc19415c87"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.342002 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b1aea1-6b22-4512-b88f-bafc19415c87-kube-api-access-c5z69" (OuterVolumeSpecName: "kube-api-access-c5z69") pod "57b1aea1-6b22-4512-b88f-bafc19415c87" (UID: "57b1aea1-6b22-4512-b88f-bafc19415c87"). InnerVolumeSpecName "kube-api-access-c5z69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.433387 4669 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.433460 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5z69\" (UniqueName: \"kubernetes.io/projected/57b1aea1-6b22-4512-b88f-bafc19415c87-kube-api-access-c5z69\") on node \"crc\" DevicePath \"\"" Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.752823 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-util" (OuterVolumeSpecName: "util") pod "57b1aea1-6b22-4512-b88f-bafc19415c87" (UID: "57b1aea1-6b22-4512-b88f-bafc19415c87"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.839414 4669 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57b1aea1-6b22-4512-b88f-bafc19415c87-util\") on node \"crc\" DevicePath \"\"" Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.972574 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" event={"ID":"57b1aea1-6b22-4512-b88f-bafc19415c87","Type":"ContainerDied","Data":"99144e8c4cb0d10ca0ce40fb3bbad8f46d1313e3e764e82cc93c6d4b311bddf4"} Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.972644 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99144e8c4cb0d10ca0ce40fb3bbad8f46d1313e3e764e82cc93c6d4b311bddf4" Oct 01 11:40:09 crc kubenswrapper[4669]: I1001 11:40:09.972677 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.491895 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-t4r7r"] Oct 01 11:40:11 crc kubenswrapper[4669]: E1001 11:40:11.492220 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b1aea1-6b22-4512-b88f-bafc19415c87" containerName="extract" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.492240 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b1aea1-6b22-4512-b88f-bafc19415c87" containerName="extract" Oct 01 11:40:11 crc kubenswrapper[4669]: E1001 11:40:11.492263 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b1aea1-6b22-4512-b88f-bafc19415c87" containerName="util" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.492272 4669 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="57b1aea1-6b22-4512-b88f-bafc19415c87" containerName="util" Oct 01 11:40:11 crc kubenswrapper[4669]: E1001 11:40:11.492284 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b1aea1-6b22-4512-b88f-bafc19415c87" containerName="pull" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.492292 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b1aea1-6b22-4512-b88f-bafc19415c87" containerName="pull" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.492412 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b1aea1-6b22-4512-b88f-bafc19415c87" containerName="extract" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.492917 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t4r7r" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.495218 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-xzv6g" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.498346 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.504564 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.514522 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-t4r7r"] Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.563268 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvz7t\" (UniqueName: \"kubernetes.io/projected/d776bb0e-3c68-4273-8aa2-e17ce4299e0c-kube-api-access-xvz7t\") pod \"nmstate-operator-5d6f6cfd66-t4r7r\" (UID: \"d776bb0e-3c68-4273-8aa2-e17ce4299e0c\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t4r7r" Oct 01 
11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.665199 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvz7t\" (UniqueName: \"kubernetes.io/projected/d776bb0e-3c68-4273-8aa2-e17ce4299e0c-kube-api-access-xvz7t\") pod \"nmstate-operator-5d6f6cfd66-t4r7r\" (UID: \"d776bb0e-3c68-4273-8aa2-e17ce4299e0c\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t4r7r" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.686665 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvz7t\" (UniqueName: \"kubernetes.io/projected/d776bb0e-3c68-4273-8aa2-e17ce4299e0c-kube-api-access-xvz7t\") pod \"nmstate-operator-5d6f6cfd66-t4r7r\" (UID: \"d776bb0e-3c68-4273-8aa2-e17ce4299e0c\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t4r7r" Oct 01 11:40:11 crc kubenswrapper[4669]: I1001 11:40:11.812590 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t4r7r" Oct 01 11:40:12 crc kubenswrapper[4669]: I1001 11:40:12.329270 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-t4r7r"] Oct 01 11:40:12 crc kubenswrapper[4669]: W1001 11:40:12.346458 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd776bb0e_3c68_4273_8aa2_e17ce4299e0c.slice/crio-f81682ccc76ca93ca4c6c2d5ec329638fe557369f8b200c4fe1c748d41d1766f WatchSource:0}: Error finding container f81682ccc76ca93ca4c6c2d5ec329638fe557369f8b200c4fe1c748d41d1766f: Status 404 returned error can't find the container with id f81682ccc76ca93ca4c6c2d5ec329638fe557369f8b200c4fe1c748d41d1766f Oct 01 11:40:12 crc kubenswrapper[4669]: I1001 11:40:12.994436 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t4r7r" 
event={"ID":"d776bb0e-3c68-4273-8aa2-e17ce4299e0c","Type":"ContainerStarted","Data":"f81682ccc76ca93ca4c6c2d5ec329638fe557369f8b200c4fe1c748d41d1766f"} Oct 01 11:40:15 crc kubenswrapper[4669]: I1001 11:40:15.007900 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t4r7r" event={"ID":"d776bb0e-3c68-4273-8aa2-e17ce4299e0c","Type":"ContainerStarted","Data":"fc2b8054434a8de9ebb19c0c4ba6a112acde3c7bcb25fdbf33040d957276dfd4"} Oct 01 11:40:15 crc kubenswrapper[4669]: I1001 11:40:15.032911 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t4r7r" podStartSLOduration=1.840580792 podStartE2EDuration="4.032873565s" podCreationTimestamp="2025-10-01 11:40:11 +0000 UTC" firstStartedPulling="2025-10-01 11:40:12.349964044 +0000 UTC m=+703.449529031" lastFinishedPulling="2025-10-01 11:40:14.542256817 +0000 UTC m=+705.641821804" observedRunningTime="2025-10-01 11:40:15.03103395 +0000 UTC m=+706.130598967" watchObservedRunningTime="2025-10-01 11:40:15.032873565 +0000 UTC m=+706.132438582" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.111750 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-87fqs"] Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.113391 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-87fqs" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.116486 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-5kt6s" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.135607 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-vbw79"] Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.136687 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.140384 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.158945 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-vbw79"] Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.182715 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-87fqs"] Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.196111 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-8p9bl"] Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.197104 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.255954 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glqcw\" (UniqueName: \"kubernetes.io/projected/10471e2d-ad87-44b7-af2e-b2209ae9337e-kube-api-access-glqcw\") pod \"nmstate-metrics-58fcddf996-87fqs\" (UID: \"10471e2d-ad87-44b7-af2e-b2209ae9337e\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-87fqs" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.256062 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4a99a9fe-0aaa-496b-97f2-e0964378b735-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-vbw79\" (UID: \"4a99a9fe-0aaa-496b-97f2-e0964378b735\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.256127 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fgjh9\" (UniqueName: \"kubernetes.io/projected/4a99a9fe-0aaa-496b-97f2-e0964378b735-kube-api-access-fgjh9\") pod \"nmstate-webhook-6d689559c5-vbw79\" (UID: \"4a99a9fe-0aaa-496b-97f2-e0964378b735\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.284263 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2"] Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.285236 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.287665 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.287941 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-gqlkx" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.287990 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.295663 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2"] Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.356931 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a2c1c01f-82d8-48e3-a140-14f363594918-ovs-socket\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.357011 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glqcw\" (UniqueName: 
\"kubernetes.io/projected/10471e2d-ad87-44b7-af2e-b2209ae9337e-kube-api-access-glqcw\") pod \"nmstate-metrics-58fcddf996-87fqs\" (UID: \"10471e2d-ad87-44b7-af2e-b2209ae9337e\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-87fqs" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.357215 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a2c1c01f-82d8-48e3-a140-14f363594918-nmstate-lock\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.357291 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48x4r\" (UniqueName: \"kubernetes.io/projected/a2c1c01f-82d8-48e3-a140-14f363594918-kube-api-access-48x4r\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.357340 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a2c1c01f-82d8-48e3-a140-14f363594918-dbus-socket\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.357508 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4a99a9fe-0aaa-496b-97f2-e0964378b735-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-vbw79\" (UID: \"4a99a9fe-0aaa-496b-97f2-e0964378b735\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.357565 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fgjh9\" (UniqueName: \"kubernetes.io/projected/4a99a9fe-0aaa-496b-97f2-e0964378b735-kube-api-access-fgjh9\") pod \"nmstate-webhook-6d689559c5-vbw79\" (UID: \"4a99a9fe-0aaa-496b-97f2-e0964378b735\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" Oct 01 11:40:16 crc kubenswrapper[4669]: E1001 11:40:16.357624 4669 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 01 11:40:16 crc kubenswrapper[4669]: E1001 11:40:16.357687 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a99a9fe-0aaa-496b-97f2-e0964378b735-tls-key-pair podName:4a99a9fe-0aaa-496b-97f2-e0964378b735 nodeName:}" failed. No retries permitted until 2025-10-01 11:40:16.85766035 +0000 UTC m=+707.957225327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/4a99a9fe-0aaa-496b-97f2-e0964378b735-tls-key-pair") pod "nmstate-webhook-6d689559c5-vbw79" (UID: "4a99a9fe-0aaa-496b-97f2-e0964378b735") : secret "openshift-nmstate-webhook" not found Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.388905 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glqcw\" (UniqueName: \"kubernetes.io/projected/10471e2d-ad87-44b7-af2e-b2209ae9337e-kube-api-access-glqcw\") pod \"nmstate-metrics-58fcddf996-87fqs\" (UID: \"10471e2d-ad87-44b7-af2e-b2209ae9337e\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-87fqs" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.399942 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgjh9\" (UniqueName: \"kubernetes.io/projected/4a99a9fe-0aaa-496b-97f2-e0964378b735-kube-api-access-fgjh9\") pod \"nmstate-webhook-6d689559c5-vbw79\" (UID: \"4a99a9fe-0aaa-496b-97f2-e0964378b735\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 
11:40:16.446293 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-87fqs" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.461933 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a2c1c01f-82d8-48e3-a140-14f363594918-ovs-socket\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.462002 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a2c1c01f-82d8-48e3-a140-14f363594918-nmstate-lock\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.462028 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48x4r\" (UniqueName: \"kubernetes.io/projected/a2c1c01f-82d8-48e3-a140-14f363594918-kube-api-access-48x4r\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.462065 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8gw5\" (UniqueName: \"kubernetes.io/projected/39594755-e0c6-4941-ac5c-b847a32459ff-kube-api-access-c8gw5\") pod \"nmstate-console-plugin-864bb6dfb5-wlrv2\" (UID: \"39594755-e0c6-4941-ac5c-b847a32459ff\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.462120 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/a2c1c01f-82d8-48e3-a140-14f363594918-dbus-socket\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.462148 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/39594755-e0c6-4941-ac5c-b847a32459ff-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-wlrv2\" (UID: \"39594755-e0c6-4941-ac5c-b847a32459ff\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.462210 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/39594755-e0c6-4941-ac5c-b847a32459ff-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-wlrv2\" (UID: \"39594755-e0c6-4941-ac5c-b847a32459ff\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.462320 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a2c1c01f-82d8-48e3-a140-14f363594918-ovs-socket\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.462359 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a2c1c01f-82d8-48e3-a140-14f363594918-nmstate-lock\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.462943 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/a2c1c01f-82d8-48e3-a140-14f363594918-dbus-socket\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.484671 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48x4r\" (UniqueName: \"kubernetes.io/projected/a2c1c01f-82d8-48e3-a140-14f363594918-kube-api-access-48x4r\") pod \"nmstate-handler-8p9bl\" (UID: \"a2c1c01f-82d8-48e3-a140-14f363594918\") " pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.517480 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.563767 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8gw5\" (UniqueName: \"kubernetes.io/projected/39594755-e0c6-4941-ac5c-b847a32459ff-kube-api-access-c8gw5\") pod \"nmstate-console-plugin-864bb6dfb5-wlrv2\" (UID: \"39594755-e0c6-4941-ac5c-b847a32459ff\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.563819 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/39594755-e0c6-4941-ac5c-b847a32459ff-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-wlrv2\" (UID: \"39594755-e0c6-4941-ac5c-b847a32459ff\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.563886 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/39594755-e0c6-4941-ac5c-b847a32459ff-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-wlrv2\" (UID: \"39594755-e0c6-4941-ac5c-b847a32459ff\") " 
pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.568866 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/39594755-e0c6-4941-ac5c-b847a32459ff-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-wlrv2\" (UID: \"39594755-e0c6-4941-ac5c-b847a32459ff\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.570424 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/39594755-e0c6-4941-ac5c-b847a32459ff-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-wlrv2\" (UID: \"39594755-e0c6-4941-ac5c-b847a32459ff\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.592773 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8gw5\" (UniqueName: \"kubernetes.io/projected/39594755-e0c6-4941-ac5c-b847a32459ff-kube-api-access-c8gw5\") pod \"nmstate-console-plugin-864bb6dfb5-wlrv2\" (UID: \"39594755-e0c6-4941-ac5c-b847a32459ff\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.601589 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.631531 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-bd7976d-jdmjm"] Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.634982 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.654845 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bd7976d-jdmjm"] Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.768476 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-555cd\" (UniqueName: \"kubernetes.io/projected/6264b534-3b2b-44c5-8c01-2ae164ece77e-kube-api-access-555cd\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.768559 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-service-ca\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.768618 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-oauth-serving-cert\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.768779 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-trusted-ca-bundle\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.768998 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6264b534-3b2b-44c5-8c01-2ae164ece77e-console-oauth-config\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.769033 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6264b534-3b2b-44c5-8c01-2ae164ece77e-console-serving-cert\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.769070 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-console-config\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.777465 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-87fqs"] Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.853963 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2"] Oct 01 11:40:16 crc kubenswrapper[4669]: W1001 11:40:16.860671 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39594755_e0c6_4941_ac5c_b847a32459ff.slice/crio-b15620cf342484ab966f6be443a213308909d5b56acbbe85d5421ca93da7177a WatchSource:0}: Error finding container b15620cf342484ab966f6be443a213308909d5b56acbbe85d5421ca93da7177a: Status 404 returned error can't find the container with id 
b15620cf342484ab966f6be443a213308909d5b56acbbe85d5421ca93da7177a Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.870710 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-trusted-ca-bundle\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.870758 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6264b534-3b2b-44c5-8c01-2ae164ece77e-console-oauth-config\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.870781 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6264b534-3b2b-44c5-8c01-2ae164ece77e-console-serving-cert\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.870800 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-console-config\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.870823 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-555cd\" (UniqueName: \"kubernetes.io/projected/6264b534-3b2b-44c5-8c01-2ae164ece77e-kube-api-access-555cd\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " 
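The transient `Failed to process watch event` warning above names a cgroup slice, `kubepods-burstable-pod39594755_e0c6_4941_ac5c_b847a32459ff.slice`, in which the pod UID is written with dashes replaced by underscores — the same UID, `39594755-e0c6-4941-ac5c-b847a32459ff`, appears in the surrounding mount entries. A small sketch (helper name ours, pattern inferred from the log above) recovering the UID from such a slice name:

```python
import re

def pod_uid_from_slice(slice_name):
    """Recover a pod UID from a kubepods cgroup slice name, where the
    kubelet encodes '-' as '_' (convention as seen in the log above)."""
    m = re.search(r'pod([0-9a-f_]+)\.slice', slice_name)
    return m.group(1).replace('_', '-') if m else None

s = 'kubepods-burstable-pod39594755_e0c6_4941_ac5c_b847a32459ff.slice'
print(pod_uid_from_slice(s))  # 39594755-e0c6-4941-ac5c-b847a32459ff
```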
pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.870884 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-service-ca\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.870915 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4a99a9fe-0aaa-496b-97f2-e0964378b735-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-vbw79\" (UID: \"4a99a9fe-0aaa-496b-97f2-e0964378b735\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.870941 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-oauth-serving-cert\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.871882 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-oauth-serving-cert\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.872806 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-console-config\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc 
kubenswrapper[4669]: I1001 11:40:16.872923 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-trusted-ca-bundle\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.873840 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6264b534-3b2b-44c5-8c01-2ae164ece77e-service-ca\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.877327 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6264b534-3b2b-44c5-8c01-2ae164ece77e-console-oauth-config\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.878011 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4a99a9fe-0aaa-496b-97f2-e0964378b735-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-vbw79\" (UID: \"4a99a9fe-0aaa-496b-97f2-e0964378b735\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.878498 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6264b534-3b2b-44c5-8c01-2ae164ece77e-console-serving-cert\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.888299 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-555cd\" (UniqueName: \"kubernetes.io/projected/6264b534-3b2b-44c5-8c01-2ae164ece77e-kube-api-access-555cd\") pod \"console-bd7976d-jdmjm\" (UID: \"6264b534-3b2b-44c5-8c01-2ae164ece77e\") " pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:16 crc kubenswrapper[4669]: I1001 11:40:16.995853 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:17 crc kubenswrapper[4669]: I1001 11:40:17.023192 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-87fqs" event={"ID":"10471e2d-ad87-44b7-af2e-b2209ae9337e","Type":"ContainerStarted","Data":"4a9619af97d54b36467c2257d15fb9cff9953905f5f0349dd813859f0fb110f1"} Oct 01 11:40:17 crc kubenswrapper[4669]: I1001 11:40:17.024296 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" event={"ID":"39594755-e0c6-4941-ac5c-b847a32459ff","Type":"ContainerStarted","Data":"b15620cf342484ab966f6be443a213308909d5b56acbbe85d5421ca93da7177a"} Oct 01 11:40:17 crc kubenswrapper[4669]: I1001 11:40:17.025062 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8p9bl" event={"ID":"a2c1c01f-82d8-48e3-a140-14f363594918","Type":"ContainerStarted","Data":"f6063cc86ab3bfd99b28901c5069ccf37b9bda9a3115ec9869040fde60a9b380"} Oct 01 11:40:17 crc kubenswrapper[4669]: I1001 11:40:17.068811 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" Oct 01 11:40:17 crc kubenswrapper[4669]: I1001 11:40:17.334862 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-vbw79"] Oct 01 11:40:17 crc kubenswrapper[4669]: I1001 11:40:17.404511 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bd7976d-jdmjm"] Oct 01 11:40:18 crc kubenswrapper[4669]: I1001 11:40:18.036894 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bd7976d-jdmjm" event={"ID":"6264b534-3b2b-44c5-8c01-2ae164ece77e","Type":"ContainerStarted","Data":"9a482530d4bd219e4b6f0613616491b2bc2090edcd1e5891e9a32daddb13185f"} Oct 01 11:40:18 crc kubenswrapper[4669]: I1001 11:40:18.037282 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bd7976d-jdmjm" event={"ID":"6264b534-3b2b-44c5-8c01-2ae164ece77e","Type":"ContainerStarted","Data":"c1e60e598cbf8fad3b2f3b619b5ac98938a54081834d27b6d65a2eed907187de"} Oct 01 11:40:18 crc kubenswrapper[4669]: I1001 11:40:18.038593 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" event={"ID":"4a99a9fe-0aaa-496b-97f2-e0964378b735","Type":"ContainerStarted","Data":"5c02573414833a318c7f3dce212c2c644e5fb74d335beb6a43411728b78ad95c"} Oct 01 11:40:18 crc kubenswrapper[4669]: I1001 11:40:18.066467 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bd7976d-jdmjm" podStartSLOduration=2.06643328 podStartE2EDuration="2.06643328s" podCreationTimestamp="2025-10-01 11:40:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:40:18.063688602 +0000 UTC m=+709.163253589" watchObservedRunningTime="2025-10-01 11:40:18.06643328 +0000 UTC m=+709.165998257" Oct 01 11:40:20 crc kubenswrapper[4669]: I1001 
11:40:20.067737 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" event={"ID":"39594755-e0c6-4941-ac5c-b847a32459ff","Type":"ContainerStarted","Data":"8359241c14c59b4816de80dabdf99209a147a148c03d5a0ddf87896607b59f74"} Oct 01 11:40:20 crc kubenswrapper[4669]: I1001 11:40:20.090047 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wlrv2" podStartSLOduration=2.07858234 podStartE2EDuration="4.0900303s" podCreationTimestamp="2025-10-01 11:40:16 +0000 UTC" firstStartedPulling="2025-10-01 11:40:16.863950376 +0000 UTC m=+707.963515343" lastFinishedPulling="2025-10-01 11:40:18.875398326 +0000 UTC m=+709.974963303" observedRunningTime="2025-10-01 11:40:20.086734049 +0000 UTC m=+711.186299026" watchObservedRunningTime="2025-10-01 11:40:20.0900303 +0000 UTC m=+711.189595277" Oct 01 11:40:21 crc kubenswrapper[4669]: I1001 11:40:21.087195 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8p9bl" event={"ID":"a2c1c01f-82d8-48e3-a140-14f363594918","Type":"ContainerStarted","Data":"9919c15d1cbdb67867e8775903e79d6e4eae673ee94c22f73a43a2119214b9bb"} Oct 01 11:40:21 crc kubenswrapper[4669]: I1001 11:40:21.087642 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:21 crc kubenswrapper[4669]: I1001 11:40:21.091385 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" event={"ID":"4a99a9fe-0aaa-496b-97f2-e0964378b735","Type":"ContainerStarted","Data":"5c95a3675e856c59d396ce98a01d24dd5761307204cf5cef0dddb36f029b64c2"} Oct 01 11:40:21 crc kubenswrapper[4669]: I1001 11:40:21.091907 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" Oct 01 11:40:21 crc kubenswrapper[4669]: I1001 11:40:21.094424 
4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-87fqs" event={"ID":"10471e2d-ad87-44b7-af2e-b2209ae9337e","Type":"ContainerStarted","Data":"bbab661d6743c89ac5955ef8ab8f7ff39d46b7b2a74c9d4566a49703723eab9f"} Oct 01 11:40:21 crc kubenswrapper[4669]: I1001 11:40:21.122959 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-8p9bl" podStartSLOduration=1.895187925 podStartE2EDuration="5.12292251s" podCreationTimestamp="2025-10-01 11:40:16 +0000 UTC" firstStartedPulling="2025-10-01 11:40:16.587683524 +0000 UTC m=+707.687248501" lastFinishedPulling="2025-10-01 11:40:19.815418099 +0000 UTC m=+710.914983086" observedRunningTime="2025-10-01 11:40:21.114718897 +0000 UTC m=+712.214283954" watchObservedRunningTime="2025-10-01 11:40:21.12292251 +0000 UTC m=+712.222487517" Oct 01 11:40:21 crc kubenswrapper[4669]: I1001 11:40:21.146428 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" podStartSLOduration=2.66139633 podStartE2EDuration="5.14639311s" podCreationTimestamp="2025-10-01 11:40:16 +0000 UTC" firstStartedPulling="2025-10-01 11:40:17.351531419 +0000 UTC m=+708.451096396" lastFinishedPulling="2025-10-01 11:40:19.836528179 +0000 UTC m=+710.936093176" observedRunningTime="2025-10-01 11:40:21.145701833 +0000 UTC m=+712.245266810" watchObservedRunningTime="2025-10-01 11:40:21.14639311 +0000 UTC m=+712.245958127" Oct 01 11:40:23 crc kubenswrapper[4669]: I1001 11:40:23.108771 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-87fqs" event={"ID":"10471e2d-ad87-44b7-af2e-b2209ae9337e","Type":"ContainerStarted","Data":"76178b32e932801b27498ffad2a76b06656c226926aa20c1c50df466d9a7bac2"} Oct 01 11:40:23 crc kubenswrapper[4669]: I1001 11:40:23.137095 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-metrics-58fcddf996-87fqs" podStartSLOduration=1.329249947 podStartE2EDuration="7.137062338s" podCreationTimestamp="2025-10-01 11:40:16 +0000 UTC" firstStartedPulling="2025-10-01 11:40:16.790036036 +0000 UTC m=+707.889601013" lastFinishedPulling="2025-10-01 11:40:22.597848427 +0000 UTC m=+713.697413404" observedRunningTime="2025-10-01 11:40:23.133178492 +0000 UTC m=+714.232743569" watchObservedRunningTime="2025-10-01 11:40:23.137062338 +0000 UTC m=+714.236627315" Oct 01 11:40:26 crc kubenswrapper[4669]: I1001 11:40:26.564279 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-8p9bl" Oct 01 11:40:26 crc kubenswrapper[4669]: I1001 11:40:26.996609 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:26 crc kubenswrapper[4669]: I1001 11:40:26.997316 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:27 crc kubenswrapper[4669]: I1001 11:40:27.007072 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:27 crc kubenswrapper[4669]: I1001 11:40:27.155922 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bd7976d-jdmjm" Oct 01 11:40:27 crc kubenswrapper[4669]: I1001 11:40:27.233987 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-cclkd"] Oct 01 11:40:37 crc kubenswrapper[4669]: I1001 11:40:37.080470 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vbw79" Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.279673 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-cclkd" 
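The `pod_startup_latency_tracker` entries above carry two durations: `podStartSLOduration` excludes image pull time while `podStartE2EDuration` includes it, so their difference should roughly match `lastFinishedPulling` minus `firstStartedPulling` (about 2.01s for the console-plugin pod above). A sketch extracting both fields — the helper name is ours, the field formats follow the log:

```python
import re

def startup_durations(line):
    """Return (slo, e2e, implied_pull_seconds) from a kubelet
    'Observed pod startup duration' line; field formats as in the log above."""
    slo = float(re.search(r'podStartSLOduration=([0-9.]+)', line).group(1))
    e2e = float(re.search(r'podStartE2EDuration="([0-9.]+)s"', line).group(1))
    return slo, e2e, round(e2e - slo, 6)

# Fields copied from the nmstate-console-plugin entry above.
sample = ('podStartSLOduration=2.07858234 podStartE2EDuration="4.0900303s" '
          'podCreationTimestamp="2025-10-01 11:40:16 +0000 UTC"')

print(startup_durations(sample))  # (2.07858234, 4.0900303, 2.011448)
```

For pods that needed no image pull (the console pod above), the two durations are identical and the implied pull time is zero.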
podUID="1467a745-44bf-40c6-a065-5008543d1363" containerName="console" containerID="cri-o://fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6" gracePeriod=15
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.744233 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-cclkd_1467a745-44bf-40c6-a065-5008543d1363/console/0.log"
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.744592 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cclkd"
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.902750 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-service-ca\") pod \"1467a745-44bf-40c6-a065-5008543d1363\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") "
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.902865 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-console-config\") pod \"1467a745-44bf-40c6-a065-5008543d1363\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") "
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.902939 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-serving-cert\") pod \"1467a745-44bf-40c6-a065-5008543d1363\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") "
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.902983 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d2n8\" (UniqueName: \"kubernetes.io/projected/1467a745-44bf-40c6-a065-5008543d1363-kube-api-access-2d2n8\") pod \"1467a745-44bf-40c6-a065-5008543d1363\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") "
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.903100 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-oauth-serving-cert\") pod \"1467a745-44bf-40c6-a065-5008543d1363\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") "
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.903258 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-trusted-ca-bundle\") pod \"1467a745-44bf-40c6-a065-5008543d1363\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") "
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.903384 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-oauth-config\") pod \"1467a745-44bf-40c6-a065-5008543d1363\" (UID: \"1467a745-44bf-40c6-a065-5008543d1363\") "
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.903930 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-console-config" (OuterVolumeSpecName: "console-config") pod "1467a745-44bf-40c6-a065-5008543d1363" (UID: "1467a745-44bf-40c6-a065-5008543d1363"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.904375 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1467a745-44bf-40c6-a065-5008543d1363" (UID: "1467a745-44bf-40c6-a065-5008543d1363"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.904425 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-service-ca" (OuterVolumeSpecName: "service-ca") pod "1467a745-44bf-40c6-a065-5008543d1363" (UID: "1467a745-44bf-40c6-a065-5008543d1363"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.904657 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1467a745-44bf-40c6-a065-5008543d1363" (UID: "1467a745-44bf-40c6-a065-5008543d1363"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.911131 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1467a745-44bf-40c6-a065-5008543d1363" (UID: "1467a745-44bf-40c6-a065-5008543d1363"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.911171 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1467a745-44bf-40c6-a065-5008543d1363-kube-api-access-2d2n8" (OuterVolumeSpecName: "kube-api-access-2d2n8") pod "1467a745-44bf-40c6-a065-5008543d1363" (UID: "1467a745-44bf-40c6-a065-5008543d1363"). InnerVolumeSpecName "kube-api-access-2d2n8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:40:52 crc kubenswrapper[4669]: I1001 11:40:52.911469 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1467a745-44bf-40c6-a065-5008543d1363" (UID: "1467a745-44bf-40c6-a065-5008543d1363"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.005750 4669 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.005816 4669 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-oauth-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.005838 4669 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-service-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.005859 4669 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-console-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.005883 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d2n8\" (UniqueName: \"kubernetes.io/projected/1467a745-44bf-40c6-a065-5008543d1363-kube-api-access-2d2n8\") on node \"crc\" DevicePath \"\""
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.005907 4669 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1467a745-44bf-40c6-a065-5008543d1363-console-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.005924 4669 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1467a745-44bf-40c6-a065-5008543d1363-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.364752 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-cclkd_1467a745-44bf-40c6-a065-5008543d1363/console/0.log"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.364853 4669 generic.go:334] "Generic (PLEG): container finished" podID="1467a745-44bf-40c6-a065-5008543d1363" containerID="fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6" exitCode=2
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.364906 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cclkd" event={"ID":"1467a745-44bf-40c6-a065-5008543d1363","Type":"ContainerDied","Data":"fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6"}
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.364961 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cclkd" event={"ID":"1467a745-44bf-40c6-a065-5008543d1363","Type":"ContainerDied","Data":"6c458cc069204979441f44c6bdcb90c38013bfa70b6bc3cc74bc1bd934e1730b"}
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.365001 4669 scope.go:117] "RemoveContainer" containerID="fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.365030 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cclkd"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.395681 4669 scope.go:117] "RemoveContainer" containerID="fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6"
Oct 01 11:40:53 crc kubenswrapper[4669]: E1001 11:40:53.396646 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6\": container with ID starting with fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6 not found: ID does not exist" containerID="fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.396717 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6"} err="failed to get container status \"fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6\": rpc error: code = NotFound desc = could not find container \"fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6\": container with ID starting with fc36b40eafcf07e38e7246a651f38993b2034fae4529b3bf947c98caa06ed4f6 not found: ID does not exist"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.417994 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-cclkd"]
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.423542 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-cclkd"]
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.634132 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"]
Oct 01 11:40:53 crc kubenswrapper[4669]: E1001 11:40:53.634405 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1467a745-44bf-40c6-a065-5008543d1363" containerName="console"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.634419 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="1467a745-44bf-40c6-a065-5008543d1363" containerName="console"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.634522 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="1467a745-44bf-40c6-a065-5008543d1363" containerName="console"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.635340 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.638791 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.706333 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1467a745-44bf-40c6-a065-5008543d1363" path="/var/lib/kubelet/pods/1467a745-44bf-40c6-a065-5008543d1363/volumes"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.707176 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"]
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.722496 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.722548 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg68r\" (UniqueName: \"kubernetes.io/projected/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-kube-api-access-xg68r\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.722642 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.824027 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.824182 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.824241 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg68r\" (UniqueName: \"kubernetes.io/projected/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-kube-api-access-xg68r\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.824869 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.825201 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:53 crc kubenswrapper[4669]: I1001 11:40:53.857195 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg68r\" (UniqueName: \"kubernetes.io/projected/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-kube-api-access-xg68r\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:54 crc kubenswrapper[4669]: I1001 11:40:54.004637 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:54 crc kubenswrapper[4669]: I1001 11:40:54.268314 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"]
Oct 01 11:40:54 crc kubenswrapper[4669]: I1001 11:40:54.378617 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq" event={"ID":"47f233f9-29d5-4aaa-b9d5-5514aaf44d14","Type":"ContainerStarted","Data":"8f1208cf9aa2b825d11662e07c95c06cd1c78cc03983c999ec8ad74f63d450fe"}
Oct 01 11:40:55 crc kubenswrapper[4669]: I1001 11:40:55.388323 4669 generic.go:334] "Generic (PLEG): container finished" podID="47f233f9-29d5-4aaa-b9d5-5514aaf44d14" containerID="9a8d2a250b44988ff30e479245bd4fc6eab923d4d84b48f87dec2a113ec0fa38" exitCode=0
Oct 01 11:40:55 crc kubenswrapper[4669]: I1001 11:40:55.388391 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq" event={"ID":"47f233f9-29d5-4aaa-b9d5-5514aaf44d14","Type":"ContainerDied","Data":"9a8d2a250b44988ff30e479245bd4fc6eab923d4d84b48f87dec2a113ec0fa38"}
Oct 01 11:40:57 crc kubenswrapper[4669]: I1001 11:40:57.407804 4669 generic.go:334] "Generic (PLEG): container finished" podID="47f233f9-29d5-4aaa-b9d5-5514aaf44d14" containerID="19fcfdf39e331b3ad20348b402e1dffba3974f1e110998cd6e3a0d97d6761273" exitCode=0
Oct 01 11:40:57 crc kubenswrapper[4669]: I1001 11:40:57.407920 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq" event={"ID":"47f233f9-29d5-4aaa-b9d5-5514aaf44d14","Type":"ContainerDied","Data":"19fcfdf39e331b3ad20348b402e1dffba3974f1e110998cd6e3a0d97d6761273"}
Oct 01 11:40:58 crc kubenswrapper[4669]: I1001 11:40:58.421046 4669 generic.go:334] "Generic (PLEG): container finished" podID="47f233f9-29d5-4aaa-b9d5-5514aaf44d14" containerID="22ce177ba1fa73dde451ed2b036096123772e514b79a1c6bcc6d265a40577bee" exitCode=0
Oct 01 11:40:58 crc kubenswrapper[4669]: I1001 11:40:58.421169 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq" event={"ID":"47f233f9-29d5-4aaa-b9d5-5514aaf44d14","Type":"ContainerDied","Data":"22ce177ba1fa73dde451ed2b036096123772e514b79a1c6bcc6d265a40577bee"}
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.533368 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sbrcs"]
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.534134 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" podUID="75bcc3da-1b36-4ee1-860e-787d82ea77e2" containerName="controller-manager" containerID="cri-o://1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e" gracePeriod=30
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.658237 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf"]
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.659269 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" podUID="b5100377-ee4b-4427-9106-eea735423f5a" containerName="route-controller-manager" containerID="cri-o://c59f9ea783df00bcedf898eeaeef9ed0b7329e6c4cf9e8bac00c0cefd55d3886" gracePeriod=30
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.731413 4669 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sbrcs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.731488 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" podUID="75bcc3da-1b36-4ee1-860e-787d82ea77e2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.785509 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.924344 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-bundle\") pod \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") "
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.924401 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg68r\" (UniqueName: \"kubernetes.io/projected/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-kube-api-access-xg68r\") pod \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") "
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.924499 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-util\") pod \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\" (UID: \"47f233f9-29d5-4aaa-b9d5-5514aaf44d14\") "
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.925953 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-bundle" (OuterVolumeSpecName: "bundle") pod "47f233f9-29d5-4aaa-b9d5-5514aaf44d14" (UID: "47f233f9-29d5-4aaa-b9d5-5514aaf44d14"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.939294 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-kube-api-access-xg68r" (OuterVolumeSpecName: "kube-api-access-xg68r") pod "47f233f9-29d5-4aaa-b9d5-5514aaf44d14" (UID: "47f233f9-29d5-4aaa-b9d5-5514aaf44d14"). InnerVolumeSpecName "kube-api-access-xg68r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:40:59 crc kubenswrapper[4669]: I1001 11:40:59.960600 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-util" (OuterVolumeSpecName: "util") pod "47f233f9-29d5-4aaa-b9d5-5514aaf44d14" (UID: "47f233f9-29d5-4aaa-b9d5-5514aaf44d14"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.026239 4669 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.026276 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg68r\" (UniqueName: \"kubernetes.io/projected/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-kube-api-access-xg68r\") on node \"crc\" DevicePath \"\""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.026293 4669 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47f233f9-29d5-4aaa-b9d5-5514aaf44d14-util\") on node \"crc\" DevicePath \"\""
Oct 01 11:41:00 crc kubenswrapper[4669]: E1001 11:41:00.086280 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5100377_ee4b_4427_9106_eea735423f5a.slice/crio-conmon-c59f9ea783df00bcedf898eeaeef9ed0b7329e6c4cf9e8bac00c0cefd55d3886.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75bcc3da_1b36_4ee1_860e_787d82ea77e2.slice/crio-conmon-1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e.scope\": RecentStats: unable to find data in memory cache]"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.413574 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.446601 4669 generic.go:334] "Generic (PLEG): container finished" podID="b5100377-ee4b-4427-9106-eea735423f5a" containerID="c59f9ea783df00bcedf898eeaeef9ed0b7329e6c4cf9e8bac00c0cefd55d3886" exitCode=0
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.446699 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" event={"ID":"b5100377-ee4b-4427-9106-eea735423f5a","Type":"ContainerDied","Data":"c59f9ea783df00bcedf898eeaeef9ed0b7329e6c4cf9e8bac00c0cefd55d3886"}
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.455562 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq" event={"ID":"47f233f9-29d5-4aaa-b9d5-5514aaf44d14","Type":"ContainerDied","Data":"8f1208cf9aa2b825d11662e07c95c06cd1c78cc03983c999ec8ad74f63d450fe"}
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.455633 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f1208cf9aa2b825d11662e07c95c06cd1c78cc03983c999ec8ad74f63d450fe"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.458400 4669 generic.go:334] "Generic (PLEG): container finished" podID="75bcc3da-1b36-4ee1-860e-787d82ea77e2" containerID="1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e" exitCode=0
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.458450 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.458474 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" event={"ID":"75bcc3da-1b36-4ee1-860e-787d82ea77e2","Type":"ContainerDied","Data":"1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e"}
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.458514 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sbrcs" event={"ID":"75bcc3da-1b36-4ee1-860e-787d82ea77e2","Type":"ContainerDied","Data":"b231f5d22917c84dd540a23b360c258c5cbb3fa95191ab9b3fe56cb743c659eb"}
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.458536 4669 scope.go:117] "RemoveContainer" containerID="1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.458450 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.482751 4669 scope.go:117] "RemoveContainer" containerID="1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e"
Oct 01 11:41:00 crc kubenswrapper[4669]: E1001 11:41:00.484259 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e\": container with ID starting with 1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e not found: ID does not exist" containerID="1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.484819 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e"} err="failed to get container status \"1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e\": rpc error: code = NotFound desc = could not find container \"1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e\": container with ID starting with 1d57d4e050d615dbea35a826f9fe3710c3ffa373936704ce873bb9f3bda55d0e not found: ID does not exist"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.538397 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-proxy-ca-bundles\") pod \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") "
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.538580 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75bcc3da-1b36-4ee1-860e-787d82ea77e2-serving-cert\") pod \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") "
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.538677 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4l7b\" (UniqueName: \"kubernetes.io/projected/75bcc3da-1b36-4ee1-860e-787d82ea77e2-kube-api-access-t4l7b\") pod \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") "
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.538729 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-config\") pod \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") "
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.538758 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-client-ca\") pod \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\" (UID: \"75bcc3da-1b36-4ee1-860e-787d82ea77e2\") "
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.539492 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "75bcc3da-1b36-4ee1-860e-787d82ea77e2" (UID: "75bcc3da-1b36-4ee1-860e-787d82ea77e2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.539660 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-client-ca" (OuterVolumeSpecName: "client-ca") pod "75bcc3da-1b36-4ee1-860e-787d82ea77e2" (UID: "75bcc3da-1b36-4ee1-860e-787d82ea77e2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.542052 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-config" (OuterVolumeSpecName: "config") pod "75bcc3da-1b36-4ee1-860e-787d82ea77e2" (UID: "75bcc3da-1b36-4ee1-860e-787d82ea77e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.542415 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.544969 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75bcc3da-1b36-4ee1-860e-787d82ea77e2-kube-api-access-t4l7b" (OuterVolumeSpecName: "kube-api-access-t4l7b") pod "75bcc3da-1b36-4ee1-860e-787d82ea77e2" (UID: "75bcc3da-1b36-4ee1-860e-787d82ea77e2"). InnerVolumeSpecName "kube-api-access-t4l7b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.545491 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75bcc3da-1b36-4ee1-860e-787d82ea77e2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "75bcc3da-1b36-4ee1-860e-787d82ea77e2" (UID: "75bcc3da-1b36-4ee1-860e-787d82ea77e2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.640003 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-client-ca\") pod \"b5100377-ee4b-4427-9106-eea735423f5a\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") "
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.640181 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-config\") pod \"b5100377-ee4b-4427-9106-eea735423f5a\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") "
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.640216 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4glcr\" (UniqueName: \"kubernetes.io/projected/b5100377-ee4b-4427-9106-eea735423f5a-kube-api-access-4glcr\") pod \"b5100377-ee4b-4427-9106-eea735423f5a\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") "
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.640333 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5100377-ee4b-4427-9106-eea735423f5a-serving-cert\") pod \"b5100377-ee4b-4427-9106-eea735423f5a\" (UID: \"b5100377-ee4b-4427-9106-eea735423f5a\") "
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.641355 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75bcc3da-1b36-4ee1-860e-787d82ea77e2-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.641378 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4l7b\" (UniqueName: \"kubernetes.io/projected/75bcc3da-1b36-4ee1-860e-787d82ea77e2-kube-api-access-t4l7b\") on node \"crc\" DevicePath \"\""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.641390 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-config\") on node \"crc\" DevicePath \"\""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.641399 4669 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-client-ca\") on node \"crc\" DevicePath \"\""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.641408 4669 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75bcc3da-1b36-4ee1-860e-787d82ea77e2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.641885 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-client-ca" (OuterVolumeSpecName: "client-ca") pod "b5100377-ee4b-4427-9106-eea735423f5a" (UID: "b5100377-ee4b-4427-9106-eea735423f5a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.641910 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-config" (OuterVolumeSpecName: "config") pod "b5100377-ee4b-4427-9106-eea735423f5a" (UID: "b5100377-ee4b-4427-9106-eea735423f5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.645091 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5100377-ee4b-4427-9106-eea735423f5a-kube-api-access-4glcr" (OuterVolumeSpecName: "kube-api-access-4glcr") pod "b5100377-ee4b-4427-9106-eea735423f5a" (UID: "b5100377-ee4b-4427-9106-eea735423f5a"). InnerVolumeSpecName "kube-api-access-4glcr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.645148 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5100377-ee4b-4427-9106-eea735423f5a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b5100377-ee4b-4427-9106-eea735423f5a" (UID: "b5100377-ee4b-4427-9106-eea735423f5a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.703052 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8"]
Oct 01 11:41:00 crc kubenswrapper[4669]: E1001 11:41:00.707000 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f233f9-29d5-4aaa-b9d5-5514aaf44d14" containerName="extract"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.707038 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f233f9-29d5-4aaa-b9d5-5514aaf44d14" containerName="extract"
Oct 01 11:41:00 crc kubenswrapper[4669]: E1001 11:41:00.707054 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f233f9-29d5-4aaa-b9d5-5514aaf44d14" containerName="pull"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.707064 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f233f9-29d5-4aaa-b9d5-5514aaf44d14" containerName="pull"
Oct 01 11:41:00 crc kubenswrapper[4669]: E1001 11:41:00.707116 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f233f9-29d5-4aaa-b9d5-5514aaf44d14" containerName="util"
Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.707129 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f233f9-29d5-4aaa-b9d5-5514aaf44d14" containerName="util"
Oct 01 11:41:00 crc kubenswrapper[4669]: E1001 11:41:00.707461 4669 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="75bcc3da-1b36-4ee1-860e-787d82ea77e2" containerName="controller-manager" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.707482 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="75bcc3da-1b36-4ee1-860e-787d82ea77e2" containerName="controller-manager" Oct 01 11:41:00 crc kubenswrapper[4669]: E1001 11:41:00.707503 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5100377-ee4b-4427-9106-eea735423f5a" containerName="route-controller-manager" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.707512 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5100377-ee4b-4427-9106-eea735423f5a" containerName="route-controller-manager" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.713250 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="75bcc3da-1b36-4ee1-860e-787d82ea77e2" containerName="controller-manager" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.713321 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f233f9-29d5-4aaa-b9d5-5514aaf44d14" containerName="extract" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.713341 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5100377-ee4b-4427-9106-eea735423f5a" containerName="route-controller-manager" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.714140 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.717221 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8"] Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.743703 4669 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.743732 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5100377-ee4b-4427-9106-eea735423f5a-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.743741 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4glcr\" (UniqueName: \"kubernetes.io/projected/b5100377-ee4b-4427-9106-eea735423f5a-kube-api-access-4glcr\") on node \"crc\" DevicePath \"\"" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.743750 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5100377-ee4b-4427-9106-eea735423f5a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.796227 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sbrcs"] Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.804332 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sbrcs"] Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.844968 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-serving-cert\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.845181 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pthlc\" (UniqueName: \"kubernetes.io/projected/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-kube-api-access-pthlc\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.845267 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-client-ca\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.845322 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-config\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.946422 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-serving-cert\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " 
pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.946518 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pthlc\" (UniqueName: \"kubernetes.io/projected/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-kube-api-access-pthlc\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.946556 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-client-ca\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.946601 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-config\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.947894 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-client-ca\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.948341 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-config\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.957912 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-serving-cert\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:00 crc kubenswrapper[4669]: I1001 11:41:00.969203 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pthlc\" (UniqueName: \"kubernetes.io/projected/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-kube-api-access-pthlc\") pod \"route-controller-manager-5cdccf55c8-dhcc8\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.048762 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.384902 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8"] Oct 01 11:41:01 crc kubenswrapper[4669]: W1001 11:41:01.402438 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49f674ec_71e6_41ca_9b24_d1b0bb9036d1.slice/crio-a4b4d4bc0d62671a9e38f69f92447b5e7baed8199bc4095ee1043a71f1fb470b WatchSource:0}: Error finding container a4b4d4bc0d62671a9e38f69f92447b5e7baed8199bc4095ee1043a71f1fb470b: Status 404 returned error can't find the container with id a4b4d4bc0d62671a9e38f69f92447b5e7baed8199bc4095ee1043a71f1fb470b Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.475998 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.476578 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf" event={"ID":"b5100377-ee4b-4427-9106-eea735423f5a","Type":"ContainerDied","Data":"5e75efc9000483c2b7ef2f9e622ceff962ccfeb2cafcc57a4b324bb7fc023f09"} Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.476660 4669 scope.go:117] "RemoveContainer" containerID="c59f9ea783df00bcedf898eeaeef9ed0b7329e6c4cf9e8bac00c0cefd55d3886" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.482232 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" event={"ID":"49f674ec-71e6-41ca-9b24-d1b0bb9036d1","Type":"ContainerStarted","Data":"a4b4d4bc0d62671a9e38f69f92447b5e7baed8199bc4095ee1043a71f1fb470b"} Oct 01 11:41:01 crc kubenswrapper[4669]: 
I1001 11:41:01.519260 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf"] Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.521901 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2r4jf"] Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.525396 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8"] Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.658670 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75bcc3da-1b36-4ee1-860e-787d82ea77e2" path="/var/lib/kubelet/pods/75bcc3da-1b36-4ee1-860e-787d82ea77e2/volumes" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.660901 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5100377-ee4b-4427-9106-eea735423f5a" path="/var/lib/kubelet/pods/b5100377-ee4b-4427-9106-eea735423f5a/volumes" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.699207 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5646fd5749-8n6m6"] Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.700750 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.704228 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.705195 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.705319 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.705442 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.705998 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.706026 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.712883 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.716154 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5646fd5749-8n6m6"] Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.860042 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7532f418-7d4d-412e-b2ba-a343cf9da659-serving-cert\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " 
pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.860277 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7532f418-7d4d-412e-b2ba-a343cf9da659-proxy-ca-bundles\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.860325 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7532f418-7d4d-412e-b2ba-a343cf9da659-client-ca\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.860358 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7532f418-7d4d-412e-b2ba-a343cf9da659-config\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.860420 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47xln\" (UniqueName: \"kubernetes.io/projected/7532f418-7d4d-412e-b2ba-a343cf9da659-kube-api-access-47xln\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.961894 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7532f418-7d4d-412e-b2ba-a343cf9da659-serving-cert\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.962025 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7532f418-7d4d-412e-b2ba-a343cf9da659-proxy-ca-bundles\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.962064 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7532f418-7d4d-412e-b2ba-a343cf9da659-client-ca\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.962127 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7532f418-7d4d-412e-b2ba-a343cf9da659-config\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.962190 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47xln\" (UniqueName: \"kubernetes.io/projected/7532f418-7d4d-412e-b2ba-a343cf9da659-kube-api-access-47xln\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.963806 
4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7532f418-7d4d-412e-b2ba-a343cf9da659-proxy-ca-bundles\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.964144 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7532f418-7d4d-412e-b2ba-a343cf9da659-config\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.964266 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7532f418-7d4d-412e-b2ba-a343cf9da659-client-ca\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.981116 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7532f418-7d4d-412e-b2ba-a343cf9da659-serving-cert\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:01 crc kubenswrapper[4669]: I1001 11:41:01.992881 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47xln\" (UniqueName: \"kubernetes.io/projected/7532f418-7d4d-412e-b2ba-a343cf9da659-kube-api-access-47xln\") pod \"controller-manager-5646fd5749-8n6m6\" (UID: \"7532f418-7d4d-412e-b2ba-a343cf9da659\") " pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 
11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.025159 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.495324 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" event={"ID":"49f674ec-71e6-41ca-9b24-d1b0bb9036d1","Type":"ContainerStarted","Data":"f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307"} Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.496148 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" podUID="49f674ec-71e6-41ca-9b24-d1b0bb9036d1" containerName="route-controller-manager" containerID="cri-o://f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307" gracePeriod=30 Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.497372 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.507067 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.526019 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5646fd5749-8n6m6"] Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.559422 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" podStartSLOduration=3.559396166 podStartE2EDuration="3.559396166s" podCreationTimestamp="2025-10-01 11:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:41:02.532027419 +0000 UTC m=+753.631592416" watchObservedRunningTime="2025-10-01 11:41:02.559396166 +0000 UTC m=+753.658961143" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.880409 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.931854 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb"] Oct 01 11:41:02 crc kubenswrapper[4669]: E1001 11:41:02.932587 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f674ec-71e6-41ca-9b24-d1b0bb9036d1" containerName="route-controller-manager" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.932601 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f674ec-71e6-41ca-9b24-d1b0bb9036d1" containerName="route-controller-manager" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.932723 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f674ec-71e6-41ca-9b24-d1b0bb9036d1" containerName="route-controller-manager" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.933235 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.949407 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb"] Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.984152 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-client-ca\") pod \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.984235 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-serving-cert\") pod \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.984285 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pthlc\" (UniqueName: \"kubernetes.io/projected/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-kube-api-access-pthlc\") pod \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.984321 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-config\") pod \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\" (UID: \"49f674ec-71e6-41ca-9b24-d1b0bb9036d1\") " Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.985791 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-config" (OuterVolumeSpecName: "config") pod "49f674ec-71e6-41ca-9b24-d1b0bb9036d1" (UID: 
"49f674ec-71e6-41ca-9b24-d1b0bb9036d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.986062 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-client-ca" (OuterVolumeSpecName: "client-ca") pod "49f674ec-71e6-41ca-9b24-d1b0bb9036d1" (UID: "49f674ec-71e6-41ca-9b24-d1b0bb9036d1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.992442 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "49f674ec-71e6-41ca-9b24-d1b0bb9036d1" (UID: "49f674ec-71e6-41ca-9b24-d1b0bb9036d1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:41:02 crc kubenswrapper[4669]: I1001 11:41:02.993539 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-kube-api-access-pthlc" (OuterVolumeSpecName: "kube-api-access-pthlc") pod "49f674ec-71e6-41ca-9b24-d1b0bb9036d1" (UID: "49f674ec-71e6-41ca-9b24-d1b0bb9036d1"). InnerVolumeSpecName "kube-api-access-pthlc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.087581 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/188f65fe-9209-4971-a9ac-ae3158e77de4-serving-cert\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.087654 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/188f65fe-9209-4971-a9ac-ae3158e77de4-client-ca\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.087673 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86442\" (UniqueName: \"kubernetes.io/projected/188f65fe-9209-4971-a9ac-ae3158e77de4-kube-api-access-86442\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.088140 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f65fe-9209-4971-a9ac-ae3158e77de4-config\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.088369 4669 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.088394 4669 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.088408 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pthlc\" (UniqueName: \"kubernetes.io/projected/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-kube-api-access-pthlc\") on node \"crc\" DevicePath \"\"" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.088425 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f674ec-71e6-41ca-9b24-d1b0bb9036d1-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.189801 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f65fe-9209-4971-a9ac-ae3158e77de4-config\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.189868 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/188f65fe-9209-4971-a9ac-ae3158e77de4-serving-cert\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.189904 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/188f65fe-9209-4971-a9ac-ae3158e77de4-client-ca\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.189945 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86442\" (UniqueName: \"kubernetes.io/projected/188f65fe-9209-4971-a9ac-ae3158e77de4-kube-api-access-86442\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.191259 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/188f65fe-9209-4971-a9ac-ae3158e77de4-client-ca\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.191390 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f65fe-9209-4971-a9ac-ae3158e77de4-config\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.195445 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/188f65fe-9209-4971-a9ac-ae3158e77de4-serving-cert\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc 
kubenswrapper[4669]: I1001 11:41:03.211830 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86442\" (UniqueName: \"kubernetes.io/projected/188f65fe-9209-4971-a9ac-ae3158e77de4-kube-api-access-86442\") pod \"route-controller-manager-77fc4bcc65-mf7lb\" (UID: \"188f65fe-9209-4971-a9ac-ae3158e77de4\") " pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.253058 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.501422 4669 generic.go:334] "Generic (PLEG): container finished" podID="49f674ec-71e6-41ca-9b24-d1b0bb9036d1" containerID="f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307" exitCode=0 Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.501564 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" event={"ID":"49f674ec-71e6-41ca-9b24-d1b0bb9036d1","Type":"ContainerDied","Data":"f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307"} Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.501636 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" event={"ID":"49f674ec-71e6-41ca-9b24-d1b0bb9036d1","Type":"ContainerDied","Data":"a4b4d4bc0d62671a9e38f69f92447b5e7baed8199bc4095ee1043a71f1fb470b"} Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.501669 4669 scope.go:117] "RemoveContainer" containerID="f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.501821 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.508206 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" event={"ID":"7532f418-7d4d-412e-b2ba-a343cf9da659","Type":"ContainerStarted","Data":"b5ea872530a502329a8ae46364b253f4a7270ff04efde8dfd11c29ea104cd68a"} Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.508271 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" event={"ID":"7532f418-7d4d-412e-b2ba-a343cf9da659","Type":"ContainerStarted","Data":"f77e7dbe2c7d15a7ac72af7ba936e0e3a1dc0bc0b1144c5d9b95ed6a89fab435"} Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.508569 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.514022 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.538606 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5646fd5749-8n6m6" podStartSLOduration=4.538582746 podStartE2EDuration="4.538582746s" podCreationTimestamp="2025-10-01 11:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:41:03.536157166 +0000 UTC m=+754.635722173" watchObservedRunningTime="2025-10-01 11:41:03.538582746 +0000 UTC m=+754.638147733" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.543180 4669 scope.go:117] "RemoveContainer" containerID="f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307" Oct 01 11:41:03 crc 
kubenswrapper[4669]: E1001 11:41:03.543712 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307\": container with ID starting with f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307 not found: ID does not exist" containerID="f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.543763 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307"} err="failed to get container status \"f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307\": rpc error: code = NotFound desc = could not find container \"f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307\": container with ID starting with f2efdf09439fe07e6d16c97990272614a0e14be91b58086110d7e6f1436c7307 not found: ID does not exist" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.552664 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8"] Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.555435 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cdccf55c8-dhcc8"] Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.650955 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f674ec-71e6-41ca-9b24-d1b0bb9036d1" path="/var/lib/kubelet/pods/49f674ec-71e6-41ca-9b24-d1b0bb9036d1/volumes" Oct 01 11:41:03 crc kubenswrapper[4669]: I1001 11:41:03.706632 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb"] Oct 01 11:41:03 crc kubenswrapper[4669]: W1001 11:41:03.710901 4669 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod188f65fe_9209_4971_a9ac_ae3158e77de4.slice/crio-4fd36959dd3a6b006f97b50b0899542be72e5776ec30e5070967ca40901198be WatchSource:0}: Error finding container 4fd36959dd3a6b006f97b50b0899542be72e5776ec30e5070967ca40901198be: Status 404 returned error can't find the container with id 4fd36959dd3a6b006f97b50b0899542be72e5776ec30e5070967ca40901198be Oct 01 11:41:04 crc kubenswrapper[4669]: I1001 11:41:04.516875 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" event={"ID":"188f65fe-9209-4971-a9ac-ae3158e77de4","Type":"ContainerStarted","Data":"e0e52766571c6c54aa7dfe8864418bb2a854d47592fa54196ce44673cc74209a"} Oct 01 11:41:04 crc kubenswrapper[4669]: I1001 11:41:04.517324 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" event={"ID":"188f65fe-9209-4971-a9ac-ae3158e77de4","Type":"ContainerStarted","Data":"4fd36959dd3a6b006f97b50b0899542be72e5776ec30e5070967ca40901198be"} Oct 01 11:41:04 crc kubenswrapper[4669]: I1001 11:41:04.520461 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:04 crc kubenswrapper[4669]: I1001 11:41:04.542385 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" podStartSLOduration=3.542354883 podStartE2EDuration="3.542354883s" podCreationTimestamp="2025-10-01 11:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:41:04.539684028 +0000 UTC m=+755.639248995" watchObservedRunningTime="2025-10-01 11:41:04.542354883 +0000 UTC m=+755.641919890" Oct 01 11:41:04 crc 
kubenswrapper[4669]: I1001 11:41:04.579892 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77fc4bcc65-mf7lb" Oct 01 11:41:07 crc kubenswrapper[4669]: I1001 11:41:07.865061 4669 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 11:41:11 crc kubenswrapper[4669]: I1001 11:41:11.781060 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r"] Oct 01 11:41:11 crc kubenswrapper[4669]: I1001 11:41:11.782153 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:11 crc kubenswrapper[4669]: I1001 11:41:11.788386 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 01 11:41:11 crc kubenswrapper[4669]: I1001 11:41:11.788473 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 01 11:41:11 crc kubenswrapper[4669]: I1001 11:41:11.788808 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 01 11:41:11 crc kubenswrapper[4669]: I1001 11:41:11.789127 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-6dltt" Oct 01 11:41:11 crc kubenswrapper[4669]: I1001 11:41:11.792302 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 01 11:41:11 crc kubenswrapper[4669]: I1001 11:41:11.809747 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r"] Oct 01 11:41:11 crc kubenswrapper[4669]: I1001 11:41:11.956482 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/562d1f16-7779-4cfb-ae80-5bad719475d1-webhook-cert\") pod \"metallb-operator-controller-manager-6774cc6d74-d656r\" (UID: \"562d1f16-7779-4cfb-ae80-5bad719475d1\") " pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:11 crc kubenswrapper[4669]: I1001 11:41:11.956555 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/562d1f16-7779-4cfb-ae80-5bad719475d1-apiservice-cert\") pod \"metallb-operator-controller-manager-6774cc6d74-d656r\" (UID: \"562d1f16-7779-4cfb-ae80-5bad719475d1\") " pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:11 crc kubenswrapper[4669]: I1001 11:41:11.956601 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85ntp\" (UniqueName: \"kubernetes.io/projected/562d1f16-7779-4cfb-ae80-5bad719475d1-kube-api-access-85ntp\") pod \"metallb-operator-controller-manager-6774cc6d74-d656r\" (UID: \"562d1f16-7779-4cfb-ae80-5bad719475d1\") " pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.057866 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/562d1f16-7779-4cfb-ae80-5bad719475d1-apiservice-cert\") pod \"metallb-operator-controller-manager-6774cc6d74-d656r\" (UID: \"562d1f16-7779-4cfb-ae80-5bad719475d1\") " pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.058339 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85ntp\" (UniqueName: 
\"kubernetes.io/projected/562d1f16-7779-4cfb-ae80-5bad719475d1-kube-api-access-85ntp\") pod \"metallb-operator-controller-manager-6774cc6d74-d656r\" (UID: \"562d1f16-7779-4cfb-ae80-5bad719475d1\") " pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.058553 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/562d1f16-7779-4cfb-ae80-5bad719475d1-webhook-cert\") pod \"metallb-operator-controller-manager-6774cc6d74-d656r\" (UID: \"562d1f16-7779-4cfb-ae80-5bad719475d1\") " pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.066933 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/562d1f16-7779-4cfb-ae80-5bad719475d1-webhook-cert\") pod \"metallb-operator-controller-manager-6774cc6d74-d656r\" (UID: \"562d1f16-7779-4cfb-ae80-5bad719475d1\") " pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.067102 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/562d1f16-7779-4cfb-ae80-5bad719475d1-apiservice-cert\") pod \"metallb-operator-controller-manager-6774cc6d74-d656r\" (UID: \"562d1f16-7779-4cfb-ae80-5bad719475d1\") " pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.096766 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85ntp\" (UniqueName: \"kubernetes.io/projected/562d1f16-7779-4cfb-ae80-5bad719475d1-kube-api-access-85ntp\") pod \"metallb-operator-controller-manager-6774cc6d74-d656r\" (UID: \"562d1f16-7779-4cfb-ae80-5bad719475d1\") " 
pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.114240 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w"] Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.115048 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:12 crc kubenswrapper[4669]: W1001 11:41:12.117211 4669 reflector.go:561] object-"metallb-system"/"metallb-webhook-cert": failed to list *v1.Secret: secrets "metallb-webhook-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 01 11:41:12 crc kubenswrapper[4669]: E1001 11:41:12.117261 4669 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-webhook-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-webhook-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 11:41:12 crc kubenswrapper[4669]: W1001 11:41:12.117416 4669 reflector.go:561] object-"metallb-system"/"controller-dockercfg-2wwkr": failed to list *v1.Secret: secrets "controller-dockercfg-2wwkr" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 01 11:41:12 crc kubenswrapper[4669]: E1001 11:41:12.117439 4669 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"controller-dockercfg-2wwkr\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"controller-dockercfg-2wwkr\" is forbidden: User \"system:node:crc\" cannot list resource 
\"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 11:41:12 crc kubenswrapper[4669]: W1001 11:41:12.117478 4669 reflector.go:561] object-"metallb-system"/"metallb-operator-webhook-server-service-cert": failed to list *v1.Secret: secrets "metallb-operator-webhook-server-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 01 11:41:12 crc kubenswrapper[4669]: E1001 11:41:12.117492 4669 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-webhook-server-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-webhook-server-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.131686 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w"] Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.160025 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-webhook-cert\") pod \"metallb-operator-webhook-server-7c796f5894-wqh8w\" (UID: \"c8793365-44bd-4d00-aa95-2d23bd134f23\") " pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.160373 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-apiservice-cert\") pod 
\"metallb-operator-webhook-server-7c796f5894-wqh8w\" (UID: \"c8793365-44bd-4d00-aa95-2d23bd134f23\") " pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.160498 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc4m9\" (UniqueName: \"kubernetes.io/projected/c8793365-44bd-4d00-aa95-2d23bd134f23-kube-api-access-nc4m9\") pod \"metallb-operator-webhook-server-7c796f5894-wqh8w\" (UID: \"c8793365-44bd-4d00-aa95-2d23bd134f23\") " pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.261663 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc4m9\" (UniqueName: \"kubernetes.io/projected/c8793365-44bd-4d00-aa95-2d23bd134f23-kube-api-access-nc4m9\") pod \"metallb-operator-webhook-server-7c796f5894-wqh8w\" (UID: \"c8793365-44bd-4d00-aa95-2d23bd134f23\") " pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.261723 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-webhook-cert\") pod \"metallb-operator-webhook-server-7c796f5894-wqh8w\" (UID: \"c8793365-44bd-4d00-aa95-2d23bd134f23\") " pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.261774 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-apiservice-cert\") pod \"metallb-operator-webhook-server-7c796f5894-wqh8w\" (UID: \"c8793365-44bd-4d00-aa95-2d23bd134f23\") " pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 
11:41:12.279877 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc4m9\" (UniqueName: \"kubernetes.io/projected/c8793365-44bd-4d00-aa95-2d23bd134f23-kube-api-access-nc4m9\") pod \"metallb-operator-webhook-server-7c796f5894-wqh8w\" (UID: \"c8793365-44bd-4d00-aa95-2d23bd134f23\") " pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.397637 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:12 crc kubenswrapper[4669]: I1001 11:41:12.906975 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r"] Oct 01 11:41:12 crc kubenswrapper[4669]: W1001 11:41:12.917574 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod562d1f16_7779_4cfb_ae80_5bad719475d1.slice/crio-dde256dabcfa1fc42d1bb29ca3f5544fab8d378d6f6eaf995161e4e87cea093a WatchSource:0}: Error finding container dde256dabcfa1fc42d1bb29ca3f5544fab8d378d6f6eaf995161e4e87cea093a: Status 404 returned error can't find the container with id dde256dabcfa1fc42d1bb29ca3f5544fab8d378d6f6eaf995161e4e87cea093a Oct 01 11:41:13 crc kubenswrapper[4669]: E1001 11:41:13.262107 4669 secret.go:188] Couldn't get secret metallb-system/metallb-operator-webhook-server-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 11:41:13 crc kubenswrapper[4669]: E1001 11:41:13.262226 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-webhook-cert podName:c8793365-44bd-4d00-aa95-2d23bd134f23 nodeName:}" failed. No retries permitted until 2025-10-01 11:41:13.762192339 +0000 UTC m=+764.861757326 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-webhook-cert") pod "metallb-operator-webhook-server-7c796f5894-wqh8w" (UID: "c8793365-44bd-4d00-aa95-2d23bd134f23") : failed to sync secret cache: timed out waiting for the condition Oct 01 11:41:13 crc kubenswrapper[4669]: E1001 11:41:13.262547 4669 secret.go:188] Couldn't get secret metallb-system/metallb-operator-webhook-server-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 11:41:13 crc kubenswrapper[4669]: E1001 11:41:13.262616 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-apiservice-cert podName:c8793365-44bd-4d00-aa95-2d23bd134f23 nodeName:}" failed. No retries permitted until 2025-10-01 11:41:13.7626034 +0000 UTC m=+764.862168387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-apiservice-cert") pod "metallb-operator-webhook-server-7c796f5894-wqh8w" (UID: "c8793365-44bd-4d00-aa95-2d23bd134f23") : failed to sync secret cache: timed out waiting for the condition Oct 01 11:41:13 crc kubenswrapper[4669]: I1001 11:41:13.315740 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 01 11:41:13 crc kubenswrapper[4669]: I1001 11:41:13.398725 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 11:41:13 crc kubenswrapper[4669]: I1001 11:41:13.590862 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" event={"ID":"562d1f16-7779-4cfb-ae80-5bad719475d1","Type":"ContainerStarted","Data":"dde256dabcfa1fc42d1bb29ca3f5544fab8d378d6f6eaf995161e4e87cea093a"} Oct 01 11:41:13 crc kubenswrapper[4669]: I1001 
11:41:13.665612 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-2wwkr" Oct 01 11:41:13 crc kubenswrapper[4669]: I1001 11:41:13.782796 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-webhook-cert\") pod \"metallb-operator-webhook-server-7c796f5894-wqh8w\" (UID: \"c8793365-44bd-4d00-aa95-2d23bd134f23\") " pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:13 crc kubenswrapper[4669]: I1001 11:41:13.782899 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-apiservice-cert\") pod \"metallb-operator-webhook-server-7c796f5894-wqh8w\" (UID: \"c8793365-44bd-4d00-aa95-2d23bd134f23\") " pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:13 crc kubenswrapper[4669]: I1001 11:41:13.798090 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-apiservice-cert\") pod \"metallb-operator-webhook-server-7c796f5894-wqh8w\" (UID: \"c8793365-44bd-4d00-aa95-2d23bd134f23\") " pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:13 crc kubenswrapper[4669]: I1001 11:41:13.798647 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c8793365-44bd-4d00-aa95-2d23bd134f23-webhook-cert\") pod \"metallb-operator-webhook-server-7c796f5894-wqh8w\" (UID: \"c8793365-44bd-4d00-aa95-2d23bd134f23\") " pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:13 crc kubenswrapper[4669]: I1001 11:41:13.929852 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:14 crc kubenswrapper[4669]: I1001 11:41:14.359454 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w"] Oct 01 11:41:14 crc kubenswrapper[4669]: I1001 11:41:14.599994 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" event={"ID":"c8793365-44bd-4d00-aa95-2d23bd134f23","Type":"ContainerStarted","Data":"4621cfc12aa40a487cc0e8994c3f66271458a8b8fc4a9f33b8fd68fcd9c1a3bb"} Oct 01 11:41:16 crc kubenswrapper[4669]: I1001 11:41:16.618426 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" event={"ID":"562d1f16-7779-4cfb-ae80-5bad719475d1","Type":"ContainerStarted","Data":"d7fea67c6374642f3fe070560fd5742c7cf63c6bf9e982f5b63df421bc84f1f6"} Oct 01 11:41:16 crc kubenswrapper[4669]: I1001 11:41:16.619030 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:16 crc kubenswrapper[4669]: I1001 11:41:16.649327 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" podStartSLOduration=2.465416413 podStartE2EDuration="5.649311874s" podCreationTimestamp="2025-10-01 11:41:11 +0000 UTC" firstStartedPulling="2025-10-01 11:41:12.924759806 +0000 UTC m=+764.024324773" lastFinishedPulling="2025-10-01 11:41:16.108655257 +0000 UTC m=+767.208220234" observedRunningTime="2025-10-01 11:41:16.648550646 +0000 UTC m=+767.748115623" watchObservedRunningTime="2025-10-01 11:41:16.649311874 +0000 UTC m=+767.748876851" Oct 01 11:41:19 crc kubenswrapper[4669]: I1001 11:41:19.640574 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" event={"ID":"c8793365-44bd-4d00-aa95-2d23bd134f23","Type":"ContainerStarted","Data":"bb4e6b73e2e11d1af27a3a1f2533d65b3589156a27c15ceac493812d0513db1e"} Oct 01 11:41:19 crc kubenswrapper[4669]: I1001 11:41:19.641386 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:19 crc kubenswrapper[4669]: I1001 11:41:19.664716 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" podStartSLOduration=2.636704304 podStartE2EDuration="7.664692189s" podCreationTimestamp="2025-10-01 11:41:12 +0000 UTC" firstStartedPulling="2025-10-01 11:41:14.395992632 +0000 UTC m=+765.495557609" lastFinishedPulling="2025-10-01 11:41:19.423980507 +0000 UTC m=+770.523545494" observedRunningTime="2025-10-01 11:41:19.661943921 +0000 UTC m=+770.761508938" watchObservedRunningTime="2025-10-01 11:41:19.664692189 +0000 UTC m=+770.764257166" Oct 01 11:41:31 crc kubenswrapper[4669]: I1001 11:41:31.863967 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:41:31 crc kubenswrapper[4669]: I1001 11:41:31.865012 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:41:33 crc kubenswrapper[4669]: I1001 11:41:33.935468 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-webhook-server-7c796f5894-wqh8w" Oct 01 11:41:52 crc kubenswrapper[4669]: I1001 11:41:52.402165 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6774cc6d74-d656r" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.276400 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-4lq96"] Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.280139 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.286385 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb"] Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.287359 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.306155 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.306276 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.306547 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-ph4t4" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.307540 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.321066 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb"] Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.357467 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-78bjr\" (UniqueName: \"kubernetes.io/projected/b38f6785-4644-476d-9014-3ad44957a952-kube-api-access-78bjr\") pod \"frr-k8s-webhook-server-5478bdb765-pnqmb\" (UID: \"b38f6785-4644-476d-9014-3ad44957a952\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.357528 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-metrics\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.357557 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdnqg\" (UniqueName: \"kubernetes.io/projected/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-kube-api-access-sdnqg\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.357590 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-metrics-certs\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.357616 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-reloader\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.357641 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" 
(UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-frr-conf\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.357658 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-frr-sockets\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.357688 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b38f6785-4644-476d-9014-3ad44957a952-cert\") pod \"frr-k8s-webhook-server-5478bdb765-pnqmb\" (UID: \"b38f6785-4644-476d-9014-3ad44957a952\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.357714 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-frr-startup\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.407847 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-t6f9w"] Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.409311 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-t6f9w" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.413204 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-8kstm"] Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.414296 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.416541 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-8kstm"] Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.454550 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.454722 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-qgsxb" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.454981 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.455000 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.455326 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459046 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dq66\" (UniqueName: \"kubernetes.io/projected/05969ea4-e97c-4b66-aa70-c4909a58472b-kube-api-access-6dq66\") pod \"controller-5d688f5ffc-8kstm\" (UID: \"05969ea4-e97c-4b66-aa70-c4909a58472b\") " pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459118 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-frr-startup\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459154 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05969ea4-e97c-4b66-aa70-c4909a58472b-cert\") pod \"controller-5d688f5ffc-8kstm\" (UID: \"05969ea4-e97c-4b66-aa70-c4909a58472b\") " pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459191 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-memberlist\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459226 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-metallb-excludel2\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459269 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78bjr\" (UniqueName: \"kubernetes.io/projected/b38f6785-4644-476d-9014-3ad44957a952-kube-api-access-78bjr\") pod \"frr-k8s-webhook-server-5478bdb765-pnqmb\" (UID: \"b38f6785-4644-476d-9014-3ad44957a952\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459318 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-metrics\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459357 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sdnqg\" (UniqueName: \"kubernetes.io/projected/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-kube-api-access-sdnqg\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459408 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-metrics-certs\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459446 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-metrics-certs\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459487 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-reloader\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459531 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx9jp\" (UniqueName: \"kubernetes.io/projected/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-kube-api-access-zx9jp\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459569 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05969ea4-e97c-4b66-aa70-c4909a58472b-metrics-certs\") pod 
\"controller-5d688f5ffc-8kstm\" (UID: \"05969ea4-e97c-4b66-aa70-c4909a58472b\") " pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459623 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-frr-conf\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459660 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-frr-sockets\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.459717 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b38f6785-4644-476d-9014-3ad44957a952-cert\") pod \"frr-k8s-webhook-server-5478bdb765-pnqmb\" (UID: \"b38f6785-4644-476d-9014-3ad44957a952\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.461311 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-frr-startup\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.461322 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-metrics\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: E1001 11:41:53.461576 4669 
secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 01 11:41:53 crc kubenswrapper[4669]: E1001 11:41:53.461693 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-metrics-certs podName:204f5c6d-d71c-4ab6-bfc8-a8682b4e997b nodeName:}" failed. No retries permitted until 2025-10-01 11:41:53.961670426 +0000 UTC m=+805.061235403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-metrics-certs") pod "frr-k8s-4lq96" (UID: "204f5c6d-d71c-4ab6-bfc8-a8682b4e997b") : secret "frr-k8s-certs-secret" not found Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.461757 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-reloader\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.461918 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-frr-sockets\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.462039 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-frr-conf\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.471442 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b38f6785-4644-476d-9014-3ad44957a952-cert\") pod \"frr-k8s-webhook-server-5478bdb765-pnqmb\" (UID: \"b38f6785-4644-476d-9014-3ad44957a952\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.490604 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78bjr\" (UniqueName: \"kubernetes.io/projected/b38f6785-4644-476d-9014-3ad44957a952-kube-api-access-78bjr\") pod \"frr-k8s-webhook-server-5478bdb765-pnqmb\" (UID: \"b38f6785-4644-476d-9014-3ad44957a952\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.501348 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdnqg\" (UniqueName: \"kubernetes.io/projected/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-kube-api-access-sdnqg\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.561064 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dq66\" (UniqueName: \"kubernetes.io/projected/05969ea4-e97c-4b66-aa70-c4909a58472b-kube-api-access-6dq66\") pod \"controller-5d688f5ffc-8kstm\" (UID: \"05969ea4-e97c-4b66-aa70-c4909a58472b\") " pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.561150 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05969ea4-e97c-4b66-aa70-c4909a58472b-cert\") pod \"controller-5d688f5ffc-8kstm\" (UID: \"05969ea4-e97c-4b66-aa70-c4909a58472b\") " pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.561184 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-memberlist\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.561205 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-metallb-excludel2\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.561278 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-metrics-certs\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.561299 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx9jp\" (UniqueName: \"kubernetes.io/projected/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-kube-api-access-zx9jp\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.561320 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05969ea4-e97c-4b66-aa70-c4909a58472b-metrics-certs\") pod \"controller-5d688f5ffc-8kstm\" (UID: \"05969ea4-e97c-4b66-aa70-c4909a58472b\") " pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:53 crc kubenswrapper[4669]: E1001 11:41:53.561495 4669 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 01 11:41:53 crc kubenswrapper[4669]: E1001 11:41:53.561556 4669 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/05969ea4-e97c-4b66-aa70-c4909a58472b-metrics-certs podName:05969ea4-e97c-4b66-aa70-c4909a58472b nodeName:}" failed. No retries permitted until 2025-10-01 11:41:54.061538126 +0000 UTC m=+805.161103103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/05969ea4-e97c-4b66-aa70-c4909a58472b-metrics-certs") pod "controller-5d688f5ffc-8kstm" (UID: "05969ea4-e97c-4b66-aa70-c4909a58472b") : secret "controller-certs-secret" not found Oct 01 11:41:53 crc kubenswrapper[4669]: E1001 11:41:53.561668 4669 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 01 11:41:53 crc kubenswrapper[4669]: E1001 11:41:53.561724 4669 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 11:41:53 crc kubenswrapper[4669]: E1001 11:41:53.561758 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-metrics-certs podName:fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230 nodeName:}" failed. No retries permitted until 2025-10-01 11:41:54.061737371 +0000 UTC m=+805.161302348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-metrics-certs") pod "speaker-t6f9w" (UID: "fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230") : secret "speaker-certs-secret" not found Oct 01 11:41:53 crc kubenswrapper[4669]: E1001 11:41:53.561805 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-memberlist podName:fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230 nodeName:}" failed. No retries permitted until 2025-10-01 11:41:54.061776422 +0000 UTC m=+805.161341399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-memberlist") pod "speaker-t6f9w" (UID: "fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230") : secret "metallb-memberlist" not found Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.562656 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-metallb-excludel2\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.563800 4669 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.576160 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05969ea4-e97c-4b66-aa70-c4909a58472b-cert\") pod \"controller-5d688f5ffc-8kstm\" (UID: \"05969ea4-e97c-4b66-aa70-c4909a58472b\") " pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.580382 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx9jp\" (UniqueName: \"kubernetes.io/projected/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-kube-api-access-zx9jp\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.592723 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dq66\" (UniqueName: \"kubernetes.io/projected/05969ea4-e97c-4b66-aa70-c4909a58472b-kube-api-access-6dq66\") pod \"controller-5d688f5ffc-8kstm\" (UID: \"05969ea4-e97c-4b66-aa70-c4909a58472b\") " pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.622881 4669 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.966270 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-metrics-certs\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:53 crc kubenswrapper[4669]: I1001 11:41:53.971062 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/204f5c6d-d71c-4ab6-bfc8-a8682b4e997b-metrics-certs\") pod \"frr-k8s-4lq96\" (UID: \"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b\") " pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.068744 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-metrics-certs\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.070039 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05969ea4-e97c-4b66-aa70-c4909a58472b-metrics-certs\") pod \"controller-5d688f5ffc-8kstm\" (UID: \"05969ea4-e97c-4b66-aa70-c4909a58472b\") " pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.070219 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-memberlist\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:54 crc kubenswrapper[4669]: E1001 11:41:54.070372 4669 secret.go:188] 
Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 11:41:54 crc kubenswrapper[4669]: E1001 11:41:54.070478 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-memberlist podName:fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230 nodeName:}" failed. No retries permitted until 2025-10-01 11:41:55.070441719 +0000 UTC m=+806.170006736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-memberlist") pod "speaker-t6f9w" (UID: "fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230") : secret "metallb-memberlist" not found Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.074737 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-metrics-certs\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.075743 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05969ea4-e97c-4b66-aa70-c4909a58472b-metrics-certs\") pod \"controller-5d688f5ffc-8kstm\" (UID: \"05969ea4-e97c-4b66-aa70-c4909a58472b\") " pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.099182 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb"] Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.135803 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.215099 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-4lq96" Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.403797 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-8kstm"] Oct 01 11:41:54 crc kubenswrapper[4669]: W1001 11:41:54.407141 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05969ea4_e97c_4b66_aa70_c4909a58472b.slice/crio-f03e6782c6e57e0276a7fa188025deee03725b9b2adcaa9edc6b9c503753ccc3 WatchSource:0}: Error finding container f03e6782c6e57e0276a7fa188025deee03725b9b2adcaa9edc6b9c503753ccc3: Status 404 returned error can't find the container with id f03e6782c6e57e0276a7fa188025deee03725b9b2adcaa9edc6b9c503753ccc3 Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.903197 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-8kstm" event={"ID":"05969ea4-e97c-4b66-aa70-c4909a58472b","Type":"ContainerStarted","Data":"5abcaccdc343045324dfdb11248bad433c5bb9da29405e68e882f5ca5dd2ed90"} Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.903690 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-8kstm" event={"ID":"05969ea4-e97c-4b66-aa70-c4909a58472b","Type":"ContainerStarted","Data":"06c3dc68ff8e97f54baf58faae75b98a47b6713104c12b5f75947f8824382eae"} Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.903714 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.903728 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-8kstm" event={"ID":"05969ea4-e97c-4b66-aa70-c4909a58472b","Type":"ContainerStarted","Data":"f03e6782c6e57e0276a7fa188025deee03725b9b2adcaa9edc6b9c503753ccc3"} Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.904148 4669 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" event={"ID":"b38f6785-4644-476d-9014-3ad44957a952","Type":"ContainerStarted","Data":"ad6920b6159daf182f2acdf9b314f46e56d13294b6be6301544ee9c67bf5237c"} Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.905332 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4lq96" event={"ID":"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b","Type":"ContainerStarted","Data":"4db56d0769c5ea0cc2ef992dc729cf74c797e3bdb0c960a1e10fe0a90e7732a1"} Oct 01 11:41:54 crc kubenswrapper[4669]: I1001 11:41:54.926196 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-8kstm" podStartSLOduration=1.926176228 podStartE2EDuration="1.926176228s" podCreationTimestamp="2025-10-01 11:41:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:41:54.922551739 +0000 UTC m=+806.022116716" watchObservedRunningTime="2025-10-01 11:41:54.926176228 +0000 UTC m=+806.025741205" Oct 01 11:41:55 crc kubenswrapper[4669]: I1001 11:41:55.090587 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-memberlist\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:55 crc kubenswrapper[4669]: I1001 11:41:55.100280 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230-memberlist\") pod \"speaker-t6f9w\" (UID: \"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230\") " pod="metallb-system/speaker-t6f9w" Oct 01 11:41:55 crc kubenswrapper[4669]: I1001 11:41:55.265546 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-t6f9w" Oct 01 11:41:55 crc kubenswrapper[4669]: I1001 11:41:55.924734 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-t6f9w" event={"ID":"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230","Type":"ContainerStarted","Data":"63bc756847c28080b8f405707f6a4e0cc73e16e7b1b3c68d7212a526e7624161"} Oct 01 11:41:55 crc kubenswrapper[4669]: I1001 11:41:55.925102 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-t6f9w" event={"ID":"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230","Type":"ContainerStarted","Data":"3f4236b4d58c28cbca80d188ea164d6b6d7d088bbdadef4087ff2a35aae3241a"} Oct 01 11:41:55 crc kubenswrapper[4669]: I1001 11:41:55.925117 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-t6f9w" event={"ID":"fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230","Type":"ContainerStarted","Data":"311cbaa89fd62d124c0bbf420646e6f2460814a9e120daae1aae07cc23312e5d"} Oct 01 11:41:55 crc kubenswrapper[4669]: I1001 11:41:55.925531 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-t6f9w" Oct 01 11:41:55 crc kubenswrapper[4669]: I1001 11:41:55.953384 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-t6f9w" podStartSLOduration=2.953360145 podStartE2EDuration="2.953360145s" podCreationTimestamp="2025-10-01 11:41:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:41:55.948365231 +0000 UTC m=+807.047930208" watchObservedRunningTime="2025-10-01 11:41:55.953360145 +0000 UTC m=+807.052925122" Oct 01 11:42:01 crc kubenswrapper[4669]: I1001 11:42:01.864538 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:42:01 crc kubenswrapper[4669]: I1001 11:42:01.872314 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:42:01 crc kubenswrapper[4669]: I1001 11:42:01.980132 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" event={"ID":"b38f6785-4644-476d-9014-3ad44957a952","Type":"ContainerStarted","Data":"9fa7b2dfb52067c7e5725b7d7258c4853c9c4df2e163752cf8864d5a4834f6c2"} Oct 01 11:42:01 crc kubenswrapper[4669]: I1001 11:42:01.980544 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" Oct 01 11:42:01 crc kubenswrapper[4669]: I1001 11:42:01.984719 4669 generic.go:334] "Generic (PLEG): container finished" podID="204f5c6d-d71c-4ab6-bfc8-a8682b4e997b" containerID="12b1bc2102563568d547c4c18a3545de83850dec2eafa89f21a7190d10c09124" exitCode=0 Oct 01 11:42:01 crc kubenswrapper[4669]: I1001 11:42:01.984911 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4lq96" event={"ID":"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b","Type":"ContainerDied","Data":"12b1bc2102563568d547c4c18a3545de83850dec2eafa89f21a7190d10c09124"} Oct 01 11:42:02 crc kubenswrapper[4669]: I1001 11:42:02.021175 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" podStartSLOduration=1.595001269 podStartE2EDuration="9.021146219s" podCreationTimestamp="2025-10-01 11:41:53 +0000 UTC" firstStartedPulling="2025-10-01 11:41:54.112436567 +0000 UTC m=+805.212001594" lastFinishedPulling="2025-10-01 11:42:01.538581557 +0000 UTC 
m=+812.638146544" observedRunningTime="2025-10-01 11:42:02.014907034 +0000 UTC m=+813.114472061" watchObservedRunningTime="2025-10-01 11:42:02.021146219 +0000 UTC m=+813.120711206" Oct 01 11:42:02 crc kubenswrapper[4669]: I1001 11:42:02.996173 4669 generic.go:334] "Generic (PLEG): container finished" podID="204f5c6d-d71c-4ab6-bfc8-a8682b4e997b" containerID="515f57ccd8b6f90c14ef3b4c4d5a33f72c7b5d649b2d86f52f0ea86696056b68" exitCode=0 Oct 01 11:42:02 crc kubenswrapper[4669]: I1001 11:42:02.996311 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4lq96" event={"ID":"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b","Type":"ContainerDied","Data":"515f57ccd8b6f90c14ef3b4c4d5a33f72c7b5d649b2d86f52f0ea86696056b68"} Oct 01 11:42:04 crc kubenswrapper[4669]: I1001 11:42:04.006346 4669 generic.go:334] "Generic (PLEG): container finished" podID="204f5c6d-d71c-4ab6-bfc8-a8682b4e997b" containerID="4a51193ca7ad2eaafd08b88aaa0197050ca0612b80d9512b53b13aa595491725" exitCode=0 Oct 01 11:42:04 crc kubenswrapper[4669]: I1001 11:42:04.006423 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4lq96" event={"ID":"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b","Type":"ContainerDied","Data":"4a51193ca7ad2eaafd08b88aaa0197050ca0612b80d9512b53b13aa595491725"} Oct 01 11:42:04 crc kubenswrapper[4669]: I1001 11:42:04.143557 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-8kstm" Oct 01 11:42:05 crc kubenswrapper[4669]: I1001 11:42:05.024733 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4lq96" event={"ID":"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b","Type":"ContainerStarted","Data":"bb413dcb2ef30f7a83621969f0604eacf0b37af1ac0fac264a0eb4f0cef2a9a0"} Oct 01 11:42:05 crc kubenswrapper[4669]: I1001 11:42:05.025257 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4lq96" 
event={"ID":"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b","Type":"ContainerStarted","Data":"02002d40fa96f8395cc58f0b71dba90df6b8c74bb03981ea0ffcf5fb45e8c03a"} Oct 01 11:42:05 crc kubenswrapper[4669]: I1001 11:42:05.025278 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4lq96" event={"ID":"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b","Type":"ContainerStarted","Data":"f572242757cf6cee842b2ef138a07ddff0231e9b3db62bf787fa0e81d2da7b94"} Oct 01 11:42:05 crc kubenswrapper[4669]: I1001 11:42:05.025294 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4lq96" event={"ID":"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b","Type":"ContainerStarted","Data":"14b0a5e16f14a03f7b042febe20a79de5587ba4c8a20e7a3dbdcd6c68a696243"} Oct 01 11:42:05 crc kubenswrapper[4669]: I1001 11:42:05.025310 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4lq96" event={"ID":"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b","Type":"ContainerStarted","Data":"ef9a84162b6606b27f16fd1f6fd8e62192b560fc8ee70cbfaa7e4a559c4f6379"} Oct 01 11:42:05 crc kubenswrapper[4669]: I1001 11:42:05.271635 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-t6f9w" Oct 01 11:42:06 crc kubenswrapper[4669]: I1001 11:42:06.057686 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4lq96" event={"ID":"204f5c6d-d71c-4ab6-bfc8-a8682b4e997b","Type":"ContainerStarted","Data":"e0372be0c5cc03802ab1c988325f0abf516b70c05bf0e7b2ffc2d1445a5c18f5"} Oct 01 11:42:06 crc kubenswrapper[4669]: I1001 11:42:06.058956 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-4lq96" Oct 01 11:42:06 crc kubenswrapper[4669]: I1001 11:42:06.099911 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-4lq96" podStartSLOduration=5.918563288 podStartE2EDuration="13.099891684s" podCreationTimestamp="2025-10-01 11:41:53 +0000 
UTC" firstStartedPulling="2025-10-01 11:41:54.394160483 +0000 UTC m=+805.493725470" lastFinishedPulling="2025-10-01 11:42:01.575488879 +0000 UTC m=+812.675053866" observedRunningTime="2025-10-01 11:42:06.09649284 +0000 UTC m=+817.196057837" watchObservedRunningTime="2025-10-01 11:42:06.099891684 +0000 UTC m=+817.199456661" Oct 01 11:42:08 crc kubenswrapper[4669]: I1001 11:42:08.405816 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lg4p2"] Oct 01 11:42:08 crc kubenswrapper[4669]: I1001 11:42:08.406971 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lg4p2" Oct 01 11:42:08 crc kubenswrapper[4669]: I1001 11:42:08.410389 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 01 11:42:08 crc kubenswrapper[4669]: I1001 11:42:08.410433 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-q2dcs" Oct 01 11:42:08 crc kubenswrapper[4669]: I1001 11:42:08.411064 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 01 11:42:08 crc kubenswrapper[4669]: I1001 11:42:08.426470 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lg4p2"] Oct 01 11:42:08 crc kubenswrapper[4669]: I1001 11:42:08.517379 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srnhr\" (UniqueName: \"kubernetes.io/projected/672d925c-cea5-49b4-ba73-3e1fb441f1d7-kube-api-access-srnhr\") pod \"openstack-operator-index-lg4p2\" (UID: \"672d925c-cea5-49b4-ba73-3e1fb441f1d7\") " pod="openstack-operators/openstack-operator-index-lg4p2" Oct 01 11:42:08 crc kubenswrapper[4669]: I1001 11:42:08.619191 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-srnhr\" (UniqueName: \"kubernetes.io/projected/672d925c-cea5-49b4-ba73-3e1fb441f1d7-kube-api-access-srnhr\") pod \"openstack-operator-index-lg4p2\" (UID: \"672d925c-cea5-49b4-ba73-3e1fb441f1d7\") " pod="openstack-operators/openstack-operator-index-lg4p2" Oct 01 11:42:08 crc kubenswrapper[4669]: I1001 11:42:08.637400 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srnhr\" (UniqueName: \"kubernetes.io/projected/672d925c-cea5-49b4-ba73-3e1fb441f1d7-kube-api-access-srnhr\") pod \"openstack-operator-index-lg4p2\" (UID: \"672d925c-cea5-49b4-ba73-3e1fb441f1d7\") " pod="openstack-operators/openstack-operator-index-lg4p2" Oct 01 11:42:08 crc kubenswrapper[4669]: I1001 11:42:08.722336 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lg4p2" Oct 01 11:42:09 crc kubenswrapper[4669]: I1001 11:42:09.178902 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lg4p2"] Oct 01 11:42:09 crc kubenswrapper[4669]: W1001 11:42:09.193928 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod672d925c_cea5_49b4_ba73_3e1fb441f1d7.slice/crio-bb1d115db33a87b56304372f576a58408297e73d9f65e429355b9db22411a695 WatchSource:0}: Error finding container bb1d115db33a87b56304372f576a58408297e73d9f65e429355b9db22411a695: Status 404 returned error can't find the container with id bb1d115db33a87b56304372f576a58408297e73d9f65e429355b9db22411a695 Oct 01 11:42:09 crc kubenswrapper[4669]: I1001 11:42:09.216057 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-4lq96" Oct 01 11:42:09 crc kubenswrapper[4669]: I1001 11:42:09.259883 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-4lq96" Oct 01 11:42:10 crc kubenswrapper[4669]: I1001 
11:42:10.088676 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lg4p2" event={"ID":"672d925c-cea5-49b4-ba73-3e1fb441f1d7","Type":"ContainerStarted","Data":"bb1d115db33a87b56304372f576a58408297e73d9f65e429355b9db22411a695"} Oct 01 11:42:11 crc kubenswrapper[4669]: I1001 11:42:11.789616 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-lg4p2"] Oct 01 11:42:12 crc kubenswrapper[4669]: I1001 11:42:12.403809 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fmzck"] Oct 01 11:42:12 crc kubenswrapper[4669]: I1001 11:42:12.405453 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fmzck" Oct 01 11:42:12 crc kubenswrapper[4669]: I1001 11:42:12.416490 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fmzck"] Oct 01 11:42:12 crc kubenswrapper[4669]: I1001 11:42:12.496255 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgxj6\" (UniqueName: \"kubernetes.io/projected/684a045b-062b-4989-85cf-f621d5c88f39-kube-api-access-pgxj6\") pod \"openstack-operator-index-fmzck\" (UID: \"684a045b-062b-4989-85cf-f621d5c88f39\") " pod="openstack-operators/openstack-operator-index-fmzck" Oct 01 11:42:12 crc kubenswrapper[4669]: I1001 11:42:12.597653 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgxj6\" (UniqueName: \"kubernetes.io/projected/684a045b-062b-4989-85cf-f621d5c88f39-kube-api-access-pgxj6\") pod \"openstack-operator-index-fmzck\" (UID: \"684a045b-062b-4989-85cf-f621d5c88f39\") " pod="openstack-operators/openstack-operator-index-fmzck" Oct 01 11:42:12 crc kubenswrapper[4669]: I1001 11:42:12.620830 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pgxj6\" (UniqueName: \"kubernetes.io/projected/684a045b-062b-4989-85cf-f621d5c88f39-kube-api-access-pgxj6\") pod \"openstack-operator-index-fmzck\" (UID: \"684a045b-062b-4989-85cf-f621d5c88f39\") " pod="openstack-operators/openstack-operator-index-fmzck" Oct 01 11:42:12 crc kubenswrapper[4669]: I1001 11:42:12.736517 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fmzck" Oct 01 11:42:13 crc kubenswrapper[4669]: I1001 11:42:13.409939 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fmzck"] Oct 01 11:42:13 crc kubenswrapper[4669]: I1001 11:42:13.628645 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-pnqmb" Oct 01 11:42:14 crc kubenswrapper[4669]: I1001 11:42:14.129676 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fmzck" event={"ID":"684a045b-062b-4989-85cf-f621d5c88f39","Type":"ContainerStarted","Data":"f47f64c33a63c17e5620172440b6a7e391a1a0fb202814452d915c3ac6757809"} Oct 01 11:42:14 crc kubenswrapper[4669]: I1001 11:42:14.129769 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fmzck" event={"ID":"684a045b-062b-4989-85cf-f621d5c88f39","Type":"ContainerStarted","Data":"f62cc3319013fb6a7851f72080eddc0b8e57e0e5bd55b35977c3f05f56fdb1d1"} Oct 01 11:42:14 crc kubenswrapper[4669]: I1001 11:42:14.134871 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lg4p2" event={"ID":"672d925c-cea5-49b4-ba73-3e1fb441f1d7","Type":"ContainerStarted","Data":"e577031694b1bb6c0957c2e4ec8aa54c33c474bd8dd4d271d56f5f039292c11c"} Oct 01 11:42:14 crc kubenswrapper[4669]: I1001 11:42:14.135051 4669 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-index-lg4p2" podUID="672d925c-cea5-49b4-ba73-3e1fb441f1d7" containerName="registry-server" containerID="cri-o://e577031694b1bb6c0957c2e4ec8aa54c33c474bd8dd4d271d56f5f039292c11c" gracePeriod=2 Oct 01 11:42:14 crc kubenswrapper[4669]: I1001 11:42:14.186304 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fmzck" podStartSLOduration=2.131924953 podStartE2EDuration="2.186277287s" podCreationTimestamp="2025-10-01 11:42:12 +0000 UTC" firstStartedPulling="2025-10-01 11:42:13.42845239 +0000 UTC m=+824.528017367" lastFinishedPulling="2025-10-01 11:42:13.482804694 +0000 UTC m=+824.582369701" observedRunningTime="2025-10-01 11:42:14.159863654 +0000 UTC m=+825.259428671" watchObservedRunningTime="2025-10-01 11:42:14.186277287 +0000 UTC m=+825.285842284" Oct 01 11:42:14 crc kubenswrapper[4669]: I1001 11:42:14.186994 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lg4p2" podStartSLOduration=2.335297634 podStartE2EDuration="6.186973085s" podCreationTimestamp="2025-10-01 11:42:08 +0000 UTC" firstStartedPulling="2025-10-01 11:42:09.198666521 +0000 UTC m=+820.298231498" lastFinishedPulling="2025-10-01 11:42:13.050341952 +0000 UTC m=+824.149906949" observedRunningTime="2025-10-01 11:42:14.185279972 +0000 UTC m=+825.284844969" watchObservedRunningTime="2025-10-01 11:42:14.186973085 +0000 UTC m=+825.286538072" Oct 01 11:42:14 crc kubenswrapper[4669]: I1001 11:42:14.220015 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-4lq96" Oct 01 11:42:15 crc kubenswrapper[4669]: I1001 11:42:15.153170 4669 generic.go:334] "Generic (PLEG): container finished" podID="672d925c-cea5-49b4-ba73-3e1fb441f1d7" containerID="e577031694b1bb6c0957c2e4ec8aa54c33c474bd8dd4d271d56f5f039292c11c" exitCode=0 Oct 01 11:42:15 crc kubenswrapper[4669]: I1001 11:42:15.153249 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lg4p2" event={"ID":"672d925c-cea5-49b4-ba73-3e1fb441f1d7","Type":"ContainerDied","Data":"e577031694b1bb6c0957c2e4ec8aa54c33c474bd8dd4d271d56f5f039292c11c"} Oct 01 11:42:15 crc kubenswrapper[4669]: I1001 11:42:15.578492 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lg4p2" Oct 01 11:42:15 crc kubenswrapper[4669]: I1001 11:42:15.748889 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srnhr\" (UniqueName: \"kubernetes.io/projected/672d925c-cea5-49b4-ba73-3e1fb441f1d7-kube-api-access-srnhr\") pod \"672d925c-cea5-49b4-ba73-3e1fb441f1d7\" (UID: \"672d925c-cea5-49b4-ba73-3e1fb441f1d7\") " Oct 01 11:42:15 crc kubenswrapper[4669]: I1001 11:42:15.756329 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/672d925c-cea5-49b4-ba73-3e1fb441f1d7-kube-api-access-srnhr" (OuterVolumeSpecName: "kube-api-access-srnhr") pod "672d925c-cea5-49b4-ba73-3e1fb441f1d7" (UID: "672d925c-cea5-49b4-ba73-3e1fb441f1d7"). InnerVolumeSpecName "kube-api-access-srnhr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:42:15 crc kubenswrapper[4669]: I1001 11:42:15.851335 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srnhr\" (UniqueName: \"kubernetes.io/projected/672d925c-cea5-49b4-ba73-3e1fb441f1d7-kube-api-access-srnhr\") on node \"crc\" DevicePath \"\"" Oct 01 11:42:16 crc kubenswrapper[4669]: I1001 11:42:16.165258 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lg4p2" event={"ID":"672d925c-cea5-49b4-ba73-3e1fb441f1d7","Type":"ContainerDied","Data":"bb1d115db33a87b56304372f576a58408297e73d9f65e429355b9db22411a695"} Oct 01 11:42:16 crc kubenswrapper[4669]: I1001 11:42:16.165385 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lg4p2" Oct 01 11:42:16 crc kubenswrapper[4669]: I1001 11:42:16.166289 4669 scope.go:117] "RemoveContainer" containerID="e577031694b1bb6c0957c2e4ec8aa54c33c474bd8dd4d271d56f5f039292c11c" Oct 01 11:42:16 crc kubenswrapper[4669]: I1001 11:42:16.215614 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-lg4p2"] Oct 01 11:42:16 crc kubenswrapper[4669]: I1001 11:42:16.222408 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-lg4p2"] Oct 01 11:42:17 crc kubenswrapper[4669]: I1001 11:42:17.657298 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="672d925c-cea5-49b4-ba73-3e1fb441f1d7" path="/var/lib/kubelet/pods/672d925c-cea5-49b4-ba73-3e1fb441f1d7/volumes" Oct 01 11:42:22 crc kubenswrapper[4669]: I1001 11:42:22.737590 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-fmzck" Oct 01 11:42:22 crc kubenswrapper[4669]: I1001 11:42:22.738247 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-fmzck" Oct 01 11:42:22 crc kubenswrapper[4669]: I1001 11:42:22.779381 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-fmzck" Oct 01 11:42:23 crc kubenswrapper[4669]: I1001 11:42:23.262444 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-fmzck" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.444032 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t"] Oct 01 11:42:25 crc kubenswrapper[4669]: E1001 11:42:25.444433 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="672d925c-cea5-49b4-ba73-3e1fb441f1d7" containerName="registry-server" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.444456 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="672d925c-cea5-49b4-ba73-3e1fb441f1d7" containerName="registry-server" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.444651 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="672d925c-cea5-49b4-ba73-3e1fb441f1d7" containerName="registry-server" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.445861 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.451054 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-558pz" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.461663 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t"] Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.513619 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-bundle\") pod \"7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") " pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.513848 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xdkr\" (UniqueName: \"kubernetes.io/projected/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-kube-api-access-8xdkr\") pod \"7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") " pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.513903 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-util\") pod \"7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") " pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 
11:42:25.615238 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-bundle\") pod \"7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") " pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.615347 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xdkr\" (UniqueName: \"kubernetes.io/projected/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-kube-api-access-8xdkr\") pod \"7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") " pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.615381 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-util\") pod \"7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") " pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.616038 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-util\") pod \"7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") " pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.616250 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-bundle\") pod \"7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") " pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.652521 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xdkr\" (UniqueName: \"kubernetes.io/projected/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-kube-api-access-8xdkr\") pod \"7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") " pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:25 crc kubenswrapper[4669]: I1001 11:42:25.765397 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:26 crc kubenswrapper[4669]: I1001 11:42:26.307467 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t"] Oct 01 11:42:26 crc kubenswrapper[4669]: W1001 11:42:26.318604 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4920edbb_5c89_4081_821f_5b7fcaa1bf9c.slice/crio-c843a28d587251e8de14a1940ce3daa76fd37f770ce6e6af2a9ea406162bc3e7 WatchSource:0}: Error finding container c843a28d587251e8de14a1940ce3daa76fd37f770ce6e6af2a9ea406162bc3e7: Status 404 returned error can't find the container with id c843a28d587251e8de14a1940ce3daa76fd37f770ce6e6af2a9ea406162bc3e7 Oct 01 11:42:27 crc kubenswrapper[4669]: I1001 11:42:27.256355 4669 generic.go:334] "Generic (PLEG): container finished" podID="4920edbb-5c89-4081-821f-5b7fcaa1bf9c" containerID="ef0867c70087d31a63b714d0a50b8752fbd9d6567132414714d5d3db890b39f5" exitCode=0 Oct 01 
11:42:27 crc kubenswrapper[4669]: I1001 11:42:27.256486 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" event={"ID":"4920edbb-5c89-4081-821f-5b7fcaa1bf9c","Type":"ContainerDied","Data":"ef0867c70087d31a63b714d0a50b8752fbd9d6567132414714d5d3db890b39f5"} Oct 01 11:42:27 crc kubenswrapper[4669]: I1001 11:42:27.256852 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" event={"ID":"4920edbb-5c89-4081-821f-5b7fcaa1bf9c","Type":"ContainerStarted","Data":"c843a28d587251e8de14a1940ce3daa76fd37f770ce6e6af2a9ea406162bc3e7"} Oct 01 11:42:28 crc kubenswrapper[4669]: I1001 11:42:28.267912 4669 generic.go:334] "Generic (PLEG): container finished" podID="4920edbb-5c89-4081-821f-5b7fcaa1bf9c" containerID="9951d12ce42bad17dad1df38346a2b329f9296e1c0364d928a6061f1dac5869b" exitCode=0 Oct 01 11:42:28 crc kubenswrapper[4669]: I1001 11:42:28.268030 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" event={"ID":"4920edbb-5c89-4081-821f-5b7fcaa1bf9c","Type":"ContainerDied","Data":"9951d12ce42bad17dad1df38346a2b329f9296e1c0364d928a6061f1dac5869b"} Oct 01 11:42:29 crc kubenswrapper[4669]: I1001 11:42:29.284627 4669 generic.go:334] "Generic (PLEG): container finished" podID="4920edbb-5c89-4081-821f-5b7fcaa1bf9c" containerID="be1264bc4bd12dc6c1f35ece1c465d19f5ff415242ea2c13b5ecf22146d60c19" exitCode=0 Oct 01 11:42:29 crc kubenswrapper[4669]: I1001 11:42:29.284692 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" event={"ID":"4920edbb-5c89-4081-821f-5b7fcaa1bf9c","Type":"ContainerDied","Data":"be1264bc4bd12dc6c1f35ece1c465d19f5ff415242ea2c13b5ecf22146d60c19"} Oct 01 11:42:30 crc kubenswrapper[4669]: I1001 11:42:30.672258 
4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" Oct 01 11:42:30 crc kubenswrapper[4669]: I1001 11:42:30.704565 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-bundle\") pod \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") " Oct 01 11:42:30 crc kubenswrapper[4669]: I1001 11:42:30.704665 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xdkr\" (UniqueName: \"kubernetes.io/projected/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-kube-api-access-8xdkr\") pod \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") " Oct 01 11:42:30 crc kubenswrapper[4669]: I1001 11:42:30.707472 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-bundle" (OuterVolumeSpecName: "bundle") pod "4920edbb-5c89-4081-821f-5b7fcaa1bf9c" (UID: "4920edbb-5c89-4081-821f-5b7fcaa1bf9c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:42:30 crc kubenswrapper[4669]: I1001 11:42:30.716621 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-kube-api-access-8xdkr" (OuterVolumeSpecName: "kube-api-access-8xdkr") pod "4920edbb-5c89-4081-821f-5b7fcaa1bf9c" (UID: "4920edbb-5c89-4081-821f-5b7fcaa1bf9c"). InnerVolumeSpecName "kube-api-access-8xdkr". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:42:30 crc kubenswrapper[4669]: I1001 11:42:30.806485 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-util\") pod \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\" (UID: \"4920edbb-5c89-4081-821f-5b7fcaa1bf9c\") "
Oct 01 11:42:30 crc kubenswrapper[4669]: I1001 11:42:30.807193 4669 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:42:30 crc kubenswrapper[4669]: I1001 11:42:30.807292 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xdkr\" (UniqueName: \"kubernetes.io/projected/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-kube-api-access-8xdkr\") on node \"crc\" DevicePath \"\""
Oct 01 11:42:30 crc kubenswrapper[4669]: I1001 11:42:30.844368 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-util" (OuterVolumeSpecName: "util") pod "4920edbb-5c89-4081-821f-5b7fcaa1bf9c" (UID: "4920edbb-5c89-4081-821f-5b7fcaa1bf9c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:42:30 crc kubenswrapper[4669]: I1001 11:42:30.908567 4669 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4920edbb-5c89-4081-821f-5b7fcaa1bf9c-util\") on node \"crc\" DevicePath \"\""
Oct 01 11:42:31 crc kubenswrapper[4669]: I1001 11:42:31.303644 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t" event={"ID":"4920edbb-5c89-4081-821f-5b7fcaa1bf9c","Type":"ContainerDied","Data":"c843a28d587251e8de14a1940ce3daa76fd37f770ce6e6af2a9ea406162bc3e7"}
Oct 01 11:42:31 crc kubenswrapper[4669]: I1001 11:42:31.303700 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c843a28d587251e8de14a1940ce3daa76fd37f770ce6e6af2a9ea406162bc3e7"
Oct 01 11:42:31 crc kubenswrapper[4669]: I1001 11:42:31.303785 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t"
Oct 01 11:42:31 crc kubenswrapper[4669]: I1001 11:42:31.864105 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 11:42:31 crc kubenswrapper[4669]: I1001 11:42:31.864210 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 11:42:31 crc kubenswrapper[4669]: I1001 11:42:31.864286 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz"
Oct 01 11:42:31 crc kubenswrapper[4669]: I1001 11:42:31.865417 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86579f99b2d7fdefab555c5926d95a0899a74cade0993be4e08705b39fe0421d"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 11:42:31 crc kubenswrapper[4669]: I1001 11:42:31.865560 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://86579f99b2d7fdefab555c5926d95a0899a74cade0993be4e08705b39fe0421d" gracePeriod=600
Oct 01 11:42:32 crc kubenswrapper[4669]: I1001 11:42:32.316689 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="86579f99b2d7fdefab555c5926d95a0899a74cade0993be4e08705b39fe0421d" exitCode=0
Oct 01 11:42:32 crc kubenswrapper[4669]: I1001 11:42:32.316815 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"86579f99b2d7fdefab555c5926d95a0899a74cade0993be4e08705b39fe0421d"}
Oct 01 11:42:32 crc kubenswrapper[4669]: I1001 11:42:32.317395 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"ef1fa470dbb217bde08acd53a153a9e8382565310fe4c3c6cd2c78b6a193aa31"}
Oct 01 11:42:32 crc kubenswrapper[4669]: I1001 11:42:32.317445 4669 scope.go:117] "RemoveContainer" containerID="749997bec659c722d86e3b88621cbc0e2b2ce7eed205c06ca6b7b63eaf908655"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.026957 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d"]
Oct 01 11:42:38 crc kubenswrapper[4669]: E1001 11:42:38.029198 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4920edbb-5c89-4081-821f-5b7fcaa1bf9c" containerName="extract"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.029233 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4920edbb-5c89-4081-821f-5b7fcaa1bf9c" containerName="extract"
Oct 01 11:42:38 crc kubenswrapper[4669]: E1001 11:42:38.029263 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4920edbb-5c89-4081-821f-5b7fcaa1bf9c" containerName="util"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.029278 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4920edbb-5c89-4081-821f-5b7fcaa1bf9c" containerName="util"
Oct 01 11:42:38 crc kubenswrapper[4669]: E1001 11:42:38.029301 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4920edbb-5c89-4081-821f-5b7fcaa1bf9c" containerName="pull"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.029313 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4920edbb-5c89-4081-821f-5b7fcaa1bf9c" containerName="pull"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.029525 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="4920edbb-5c89-4081-821f-5b7fcaa1bf9c" containerName="extract"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.030597 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.034415 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-ft4rg"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.053330 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d"]
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.135869 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc22n\" (UniqueName: \"kubernetes.io/projected/7fe95054-218b-47f4-a729-95a7f6b45a3d-kube-api-access-mc22n\") pod \"openstack-operator-controller-operator-76995989df-q6c9d\" (UID: \"7fe95054-218b-47f4-a729-95a7f6b45a3d\") " pod="openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.238041 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc22n\" (UniqueName: \"kubernetes.io/projected/7fe95054-218b-47f4-a729-95a7f6b45a3d-kube-api-access-mc22n\") pod \"openstack-operator-controller-operator-76995989df-q6c9d\" (UID: \"7fe95054-218b-47f4-a729-95a7f6b45a3d\") " pod="openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.273047 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc22n\" (UniqueName: \"kubernetes.io/projected/7fe95054-218b-47f4-a729-95a7f6b45a3d-kube-api-access-mc22n\") pod \"openstack-operator-controller-operator-76995989df-q6c9d\" (UID: \"7fe95054-218b-47f4-a729-95a7f6b45a3d\") " pod="openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.379166 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d"
Oct 01 11:42:38 crc kubenswrapper[4669]: I1001 11:42:38.864518 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d"]
Oct 01 11:42:39 crc kubenswrapper[4669]: I1001 11:42:39.386743 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d" event={"ID":"7fe95054-218b-47f4-a729-95a7f6b45a3d","Type":"ContainerStarted","Data":"922640813e95e5fff60b0aa30a15d6fa837a06fe513a66266b0bc34b5545bb2b"}
Oct 01 11:42:44 crc kubenswrapper[4669]: I1001 11:42:44.451365 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d" event={"ID":"7fe95054-218b-47f4-a729-95a7f6b45a3d","Type":"ContainerStarted","Data":"139a132060b2d68db18a5bd81683c4ac52808bb8d18fdfc9f507e3d38c614494"}
Oct 01 11:42:47 crc kubenswrapper[4669]: I1001 11:42:47.480497 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d" event={"ID":"7fe95054-218b-47f4-a729-95a7f6b45a3d","Type":"ContainerStarted","Data":"f14abe04fa136d6fb00ebc3bc3487983880dab0f439f18b0c89ef6c12f949235"}
Oct 01 11:42:47 crc kubenswrapper[4669]: I1001 11:42:47.481393 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d"
Oct 01 11:42:47 crc kubenswrapper[4669]: I1001 11:42:47.528627 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d" podStartSLOduration=2.689057255 podStartE2EDuration="10.528603178s" podCreationTimestamp="2025-10-01 11:42:37 +0000 UTC" firstStartedPulling="2025-10-01 11:42:38.871237685 +0000 UTC m=+849.970802672" lastFinishedPulling="2025-10-01 11:42:46.710783608 +0000 UTC m=+857.810348595" observedRunningTime="2025-10-01 11:42:47.524238591 +0000 UTC m=+858.623803568" watchObservedRunningTime="2025-10-01 11:42:47.528603178 +0000 UTC m=+858.628168175"
Oct 01 11:42:48 crc kubenswrapper[4669]: I1001 11:42:48.383816 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-76995989df-q6c9d"
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.527129 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pzbxh"]
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.528593 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.569316 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzbxh"]
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.640863 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-catalog-content\") pod \"redhat-marketplace-pzbxh\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") " pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.640923 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7pth\" (UniqueName: \"kubernetes.io/projected/490d91dd-a901-4245-a79c-af03eda18c4d-kube-api-access-b7pth\") pod \"redhat-marketplace-pzbxh\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") " pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.640956 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-utilities\") pod \"redhat-marketplace-pzbxh\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") " pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.742595 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-catalog-content\") pod \"redhat-marketplace-pzbxh\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") " pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.742694 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7pth\" (UniqueName: \"kubernetes.io/projected/490d91dd-a901-4245-a79c-af03eda18c4d-kube-api-access-b7pth\") pod \"redhat-marketplace-pzbxh\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") " pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.742760 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-utilities\") pod \"redhat-marketplace-pzbxh\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") " pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.743617 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-utilities\") pod \"redhat-marketplace-pzbxh\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") " pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.744135 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-catalog-content\") pod \"redhat-marketplace-pzbxh\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") " pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.779956 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7pth\" (UniqueName: \"kubernetes.io/projected/490d91dd-a901-4245-a79c-af03eda18c4d-kube-api-access-b7pth\") pod \"redhat-marketplace-pzbxh\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") " pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:49 crc kubenswrapper[4669]: I1001 11:42:49.865303 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:50 crc kubenswrapper[4669]: I1001 11:42:50.428464 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzbxh"]
Oct 01 11:42:50 crc kubenswrapper[4669]: W1001 11:42:50.437865 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod490d91dd_a901_4245_a79c_af03eda18c4d.slice/crio-cba782ffe133d62678b2a76ecd14f2f39fb8af64ada26425abde153cc8e60eeb WatchSource:0}: Error finding container cba782ffe133d62678b2a76ecd14f2f39fb8af64ada26425abde153cc8e60eeb: Status 404 returned error can't find the container with id cba782ffe133d62678b2a76ecd14f2f39fb8af64ada26425abde153cc8e60eeb
Oct 01 11:42:50 crc kubenswrapper[4669]: I1001 11:42:50.510446 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzbxh" event={"ID":"490d91dd-a901-4245-a79c-af03eda18c4d","Type":"ContainerStarted","Data":"cba782ffe133d62678b2a76ecd14f2f39fb8af64ada26425abde153cc8e60eeb"}
Oct 01 11:42:51 crc kubenswrapper[4669]: I1001 11:42:51.518480 4669 generic.go:334] "Generic (PLEG): container finished" podID="490d91dd-a901-4245-a79c-af03eda18c4d" containerID="5c9e65851de99e33f8eae657769da9e5c79b33abf3e56dfd0a7d22236212e206" exitCode=0
Oct 01 11:42:51 crc kubenswrapper[4669]: I1001 11:42:51.518548 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzbxh" event={"ID":"490d91dd-a901-4245-a79c-af03eda18c4d","Type":"ContainerDied","Data":"5c9e65851de99e33f8eae657769da9e5c79b33abf3e56dfd0a7d22236212e206"}
Oct 01 11:42:52 crc kubenswrapper[4669]: I1001 11:42:52.527753 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzbxh" event={"ID":"490d91dd-a901-4245-a79c-af03eda18c4d","Type":"ContainerStarted","Data":"d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e"}
Oct 01 11:42:53 crc kubenswrapper[4669]: I1001 11:42:53.541217 4669 generic.go:334] "Generic (PLEG): container finished" podID="490d91dd-a901-4245-a79c-af03eda18c4d" containerID="d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e" exitCode=0
Oct 01 11:42:53 crc kubenswrapper[4669]: I1001 11:42:53.541372 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzbxh" event={"ID":"490d91dd-a901-4245-a79c-af03eda18c4d","Type":"ContainerDied","Data":"d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e"}
Oct 01 11:42:54 crc kubenswrapper[4669]: I1001 11:42:54.552325 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzbxh" event={"ID":"490d91dd-a901-4245-a79c-af03eda18c4d","Type":"ContainerStarted","Data":"e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb"}
Oct 01 11:42:54 crc kubenswrapper[4669]: I1001 11:42:54.585023 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pzbxh" podStartSLOduration=3.116221465 podStartE2EDuration="5.584990745s" podCreationTimestamp="2025-10-01 11:42:49 +0000 UTC" firstStartedPulling="2025-10-01 11:42:51.520823625 +0000 UTC m=+862.620388612" lastFinishedPulling="2025-10-01 11:42:53.989592885 +0000 UTC m=+865.089157892" observedRunningTime="2025-10-01 11:42:54.581215913 +0000 UTC m=+865.680780900" watchObservedRunningTime="2025-10-01 11:42:54.584990745 +0000 UTC m=+865.684555732"
Oct 01 11:42:59 crc kubenswrapper[4669]: I1001 11:42:59.866019 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:59 crc kubenswrapper[4669]: I1001 11:42:59.866505 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:42:59 crc kubenswrapper[4669]: I1001 11:42:59.913858 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:43:00 crc kubenswrapper[4669]: I1001 11:43:00.688343 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:43:02 crc kubenswrapper[4669]: I1001 11:43:02.313145 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzbxh"]
Oct 01 11:43:03 crc kubenswrapper[4669]: I1001 11:43:03.628730 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pzbxh" podUID="490d91dd-a901-4245-a79c-af03eda18c4d" containerName="registry-server" containerID="cri-o://e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb" gracePeriod=2
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.110011 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.185381 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-utilities\") pod \"490d91dd-a901-4245-a79c-af03eda18c4d\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") "
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.185424 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-catalog-content\") pod \"490d91dd-a901-4245-a79c-af03eda18c4d\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") "
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.186201 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-utilities" (OuterVolumeSpecName: "utilities") pod "490d91dd-a901-4245-a79c-af03eda18c4d" (UID: "490d91dd-a901-4245-a79c-af03eda18c4d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.198839 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "490d91dd-a901-4245-a79c-af03eda18c4d" (UID: "490d91dd-a901-4245-a79c-af03eda18c4d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.286473 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7pth\" (UniqueName: \"kubernetes.io/projected/490d91dd-a901-4245-a79c-af03eda18c4d-kube-api-access-b7pth\") pod \"490d91dd-a901-4245-a79c-af03eda18c4d\" (UID: \"490d91dd-a901-4245-a79c-af03eda18c4d\") "
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.286881 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.286901 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490d91dd-a901-4245-a79c-af03eda18c4d-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.293183 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490d91dd-a901-4245-a79c-af03eda18c4d-kube-api-access-b7pth" (OuterVolumeSpecName: "kube-api-access-b7pth") pod "490d91dd-a901-4245-a79c-af03eda18c4d" (UID: "490d91dd-a901-4245-a79c-af03eda18c4d"). InnerVolumeSpecName "kube-api-access-b7pth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.387989 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7pth\" (UniqueName: \"kubernetes.io/projected/490d91dd-a901-4245-a79c-af03eda18c4d-kube-api-access-b7pth\") on node \"crc\" DevicePath \"\""
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.638047 4669 generic.go:334] "Generic (PLEG): container finished" podID="490d91dd-a901-4245-a79c-af03eda18c4d" containerID="e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb" exitCode=0
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.638134 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzbxh" event={"ID":"490d91dd-a901-4245-a79c-af03eda18c4d","Type":"ContainerDied","Data":"e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb"}
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.638247 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzbxh" event={"ID":"490d91dd-a901-4245-a79c-af03eda18c4d","Type":"ContainerDied","Data":"cba782ffe133d62678b2a76ecd14f2f39fb8af64ada26425abde153cc8e60eeb"}
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.638284 4669 scope.go:117] "RemoveContainer" containerID="e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb"
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.638162 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzbxh"
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.665737 4669 scope.go:117] "RemoveContainer" containerID="d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e"
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.675567 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzbxh"]
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.688563 4669 scope.go:117] "RemoveContainer" containerID="5c9e65851de99e33f8eae657769da9e5c79b33abf3e56dfd0a7d22236212e206"
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.689322 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzbxh"]
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.712139 4669 scope.go:117] "RemoveContainer" containerID="e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb"
Oct 01 11:43:04 crc kubenswrapper[4669]: E1001 11:43:04.712695 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb\": container with ID starting with e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb not found: ID does not exist" containerID="e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb"
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.712789 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb"} err="failed to get container status \"e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb\": rpc error: code = NotFound desc = could not find container \"e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb\": container with ID starting with e6f9b53cbf286ac38771a341ece8efb5927dd30ffee88540da5a4b4cff0fedfb not found: ID does not exist"
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.712838 4669 scope.go:117] "RemoveContainer" containerID="d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e"
Oct 01 11:43:04 crc kubenswrapper[4669]: E1001 11:43:04.713341 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e\": container with ID starting with d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e not found: ID does not exist" containerID="d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e"
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.713411 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e"} err="failed to get container status \"d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e\": rpc error: code = NotFound desc = could not find container \"d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e\": container with ID starting with d2a9464eaa6c870326d25f1491a0e4ac6ab4cf1c2ceb8a3b2b8116408c94c41e not found: ID does not exist"
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.713454 4669 scope.go:117] "RemoveContainer" containerID="5c9e65851de99e33f8eae657769da9e5c79b33abf3e56dfd0a7d22236212e206"
Oct 01 11:43:04 crc kubenswrapper[4669]: E1001 11:43:04.714256 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c9e65851de99e33f8eae657769da9e5c79b33abf3e56dfd0a7d22236212e206\": container with ID starting with 5c9e65851de99e33f8eae657769da9e5c79b33abf3e56dfd0a7d22236212e206 not found: ID does not exist" containerID="5c9e65851de99e33f8eae657769da9e5c79b33abf3e56dfd0a7d22236212e206"
Oct 01 11:43:04 crc kubenswrapper[4669]: I1001 11:43:04.714290 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9e65851de99e33f8eae657769da9e5c79b33abf3e56dfd0a7d22236212e206"} err="failed to get container status \"5c9e65851de99e33f8eae657769da9e5c79b33abf3e56dfd0a7d22236212e206\": rpc error: code = NotFound desc = could not find container \"5c9e65851de99e33f8eae657769da9e5c79b33abf3e56dfd0a7d22236212e206\": container with ID starting with 5c9e65851de99e33f8eae657769da9e5c79b33abf3e56dfd0a7d22236212e206 not found: ID does not exist"
Oct 01 11:43:05 crc kubenswrapper[4669]: I1001 11:43:05.652359 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490d91dd-a901-4245-a79c-af03eda18c4d" path="/var/lib/kubelet/pods/490d91dd-a901-4245-a79c-af03eda18c4d/volumes"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.060068 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq"]
Oct 01 11:43:06 crc kubenswrapper[4669]: E1001 11:43:06.060404 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490d91dd-a901-4245-a79c-af03eda18c4d" containerName="extract-content"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.060424 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="490d91dd-a901-4245-a79c-af03eda18c4d" containerName="extract-content"
Oct 01 11:43:06 crc kubenswrapper[4669]: E1001 11:43:06.060445 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490d91dd-a901-4245-a79c-af03eda18c4d" containerName="registry-server"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.060454 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="490d91dd-a901-4245-a79c-af03eda18c4d" containerName="registry-server"
Oct 01 11:43:06 crc kubenswrapper[4669]: E1001 11:43:06.060468 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490d91dd-a901-4245-a79c-af03eda18c4d" containerName="extract-utilities"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.060477 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="490d91dd-a901-4245-a79c-af03eda18c4d" containerName="extract-utilities"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.060603 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="490d91dd-a901-4245-a79c-af03eda18c4d" containerName="registry-server"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.061447 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.063573 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4vc89"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.072151 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp"]
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.073421 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.077880 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-pnswb"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.085840 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp"]
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.090697 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq"]
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.104062 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57"]
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.106130 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.109480 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zwffc"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.113222 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzd8v\" (UniqueName: \"kubernetes.io/projected/7f0d56cb-1002-4345-903e-7e5979f47978-kube-api-access-lzd8v\") pod \"cinder-operator-controller-manager-644bddb6d8-cz2dp\" (UID: \"7f0d56cb-1002-4345-903e-7e5979f47978\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.113280 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5jd\" (UniqueName: \"kubernetes.io/projected/aba4ff11-8110-4490-8a20-74c454be55d8-kube-api-access-mv5jd\") pod \"barbican-operator-controller-manager-6ff8b75857-9m4xq\" (UID: \"aba4ff11-8110-4490-8a20-74c454be55d8\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.113332 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qvnl\" (UniqueName: \"kubernetes.io/projected/4fa32a0a-904e-4b37-8ffb-a8c1d89df689-kube-api-access-2qvnl\") pod \"designate-operator-controller-manager-84f4f7b77b-xwt57\" (UID: \"4fa32a0a-904e-4b37-8ffb-a8c1d89df689\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.121058 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57"]
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.127497 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6"]
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.128746 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.131587 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-k6w72"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.157115 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6"]
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.216590 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzd8v\" (UniqueName: \"kubernetes.io/projected/7f0d56cb-1002-4345-903e-7e5979f47978-kube-api-access-lzd8v\") pod \"cinder-operator-controller-manager-644bddb6d8-cz2dp\" (UID: \"7f0d56cb-1002-4345-903e-7e5979f47978\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.216651 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5jd\" (UniqueName: \"kubernetes.io/projected/aba4ff11-8110-4490-8a20-74c454be55d8-kube-api-access-mv5jd\") pod \"barbican-operator-controller-manager-6ff8b75857-9m4xq\" (UID: \"aba4ff11-8110-4490-8a20-74c454be55d8\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.216704 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qvnl\" (UniqueName: \"kubernetes.io/projected/4fa32a0a-904e-4b37-8ffb-a8c1d89df689-kube-api-access-2qvnl\") pod \"designate-operator-controller-manager-84f4f7b77b-xwt57\" (UID: \"4fa32a0a-904e-4b37-8ffb-a8c1d89df689\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.216740 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97hq\" (UniqueName: \"kubernetes.io/projected/e8163ded-d297-43ea-bde7-b5b90bdf1d17-kube-api-access-t97hq\") pod \"glance-operator-controller-manager-84958c4d49-dhqk6\" (UID: \"e8163ded-d297-43ea-bde7-b5b90bdf1d17\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.231031 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q"]
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.232293 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.235527 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jv2rd"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.243165 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq"]
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.244509 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.246279 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-l4p9z"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.256504 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5jd\" (UniqueName: \"kubernetes.io/projected/aba4ff11-8110-4490-8a20-74c454be55d8-kube-api-access-mv5jd\") pod \"barbican-operator-controller-manager-6ff8b75857-9m4xq\" (UID: \"aba4ff11-8110-4490-8a20-74c454be55d8\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.260819 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qvnl\" (UniqueName: \"kubernetes.io/projected/4fa32a0a-904e-4b37-8ffb-a8c1d89df689-kube-api-access-2qvnl\") pod \"designate-operator-controller-manager-84f4f7b77b-xwt57\" (UID: \"4fa32a0a-904e-4b37-8ffb-a8c1d89df689\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.261904 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzd8v\" (UniqueName: \"kubernetes.io/projected/7f0d56cb-1002-4345-903e-7e5979f47978-kube-api-access-lzd8v\") pod \"cinder-operator-controller-manager-644bddb6d8-cz2dp\" (UID: \"7f0d56cb-1002-4345-903e-7e5979f47978\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp"
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.262195 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q"]
Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.276922 4669 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.278411 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.292519 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.292743 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bzbh7" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.305247 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.312781 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.318342 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wphw8\" (UniqueName: \"kubernetes.io/projected/1265856e-7658-44ca-b0a9-a0a5a42b8f5d-kube-api-access-wphw8\") pod \"horizon-operator-controller-manager-9f4696d94-5fxfq\" (UID: \"1265856e-7658-44ca-b0a9-a0a5a42b8f5d\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.318433 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a887d629-1025-4da7-8c68-4b17c7205479-cert\") pod \"infra-operator-controller-manager-9d6c5db85-dv8s2\" (UID: \"a887d629-1025-4da7-8c68-4b17c7205479\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" Oct 
01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.318483 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlllt\" (UniqueName: \"kubernetes.io/projected/a887d629-1025-4da7-8c68-4b17c7205479-kube-api-access-tlllt\") pod \"infra-operator-controller-manager-9d6c5db85-dv8s2\" (UID: \"a887d629-1025-4da7-8c68-4b17c7205479\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.318524 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t97hq\" (UniqueName: \"kubernetes.io/projected/e8163ded-d297-43ea-bde7-b5b90bdf1d17-kube-api-access-t97hq\") pod \"glance-operator-controller-manager-84958c4d49-dhqk6\" (UID: \"e8163ded-d297-43ea-bde7-b5b90bdf1d17\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.318551 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww56n\" (UniqueName: \"kubernetes.io/projected/ebc9c519-e267-43d1-93b7-4cf38c84cc66-kube-api-access-ww56n\") pod \"heat-operator-controller-manager-5d889d78cf-2z84q\" (UID: \"ebc9c519-e267-43d1-93b7-4cf38c84cc66\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.332301 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.333736 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.338868 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t97hq\" (UniqueName: \"kubernetes.io/projected/e8163ded-d297-43ea-bde7-b5b90bdf1d17-kube-api-access-t97hq\") pod \"glance-operator-controller-manager-84958c4d49-dhqk6\" (UID: \"e8163ded-d297-43ea-bde7-b5b90bdf1d17\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.340437 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-s6ksw" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.353170 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.354425 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.358433 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-szj4z" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.382766 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.387320 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.400991 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.419381 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wphw8\" (UniqueName: \"kubernetes.io/projected/1265856e-7658-44ca-b0a9-a0a5a42b8f5d-kube-api-access-wphw8\") pod \"horizon-operator-controller-manager-9f4696d94-5fxfq\" (UID: \"1265856e-7658-44ca-b0a9-a0a5a42b8f5d\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.419447 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a887d629-1025-4da7-8c68-4b17c7205479-cert\") pod \"infra-operator-controller-manager-9d6c5db85-dv8s2\" (UID: \"a887d629-1025-4da7-8c68-4b17c7205479\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.419482 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlllt\" (UniqueName: \"kubernetes.io/projected/a887d629-1025-4da7-8c68-4b17c7205479-kube-api-access-tlllt\") pod \"infra-operator-controller-manager-9d6c5db85-dv8s2\" (UID: \"a887d629-1025-4da7-8c68-4b17c7205479\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.419508 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww56n\" (UniqueName: \"kubernetes.io/projected/ebc9c519-e267-43d1-93b7-4cf38c84cc66-kube-api-access-ww56n\") pod \"heat-operator-controller-manager-5d889d78cf-2z84q\" (UID: \"ebc9c519-e267-43d1-93b7-4cf38c84cc66\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.419536 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55r5w\" (UniqueName: \"kubernetes.io/projected/400a027c-2dab-48e5-a109-e7b64d35807a-kube-api-access-55r5w\") pod \"keystone-operator-controller-manager-5bd55b4bff-j8t6g\" (UID: \"400a027c-2dab-48e5-a109-e7b64d35807a\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.419556 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qfkp\" (UniqueName: \"kubernetes.io/projected/863b3375-804f-4c8b-ba14-01230d822604-kube-api-access-9qfkp\") pod \"ironic-operator-controller-manager-5cd4858477-lbq2b\" (UID: \"863b3375-804f-4c8b-ba14-01230d822604\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.420731 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.425062 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.433671 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a887d629-1025-4da7-8c68-4b17c7205479-cert\") pod \"infra-operator-controller-manager-9d6c5db85-dv8s2\" (UID: \"a887d629-1025-4da7-8c68-4b17c7205479\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.450694 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.451185 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.452937 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlllt\" (UniqueName: \"kubernetes.io/projected/a887d629-1025-4da7-8c68-4b17c7205479-kube-api-access-tlllt\") pod \"infra-operator-controller-manager-9d6c5db85-dv8s2\" (UID: \"a887d629-1025-4da7-8c68-4b17c7205479\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.455822 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wphw8\" (UniqueName: \"kubernetes.io/projected/1265856e-7658-44ca-b0a9-a0a5a42b8f5d-kube-api-access-wphw8\") pod \"horizon-operator-controller-manager-9f4696d94-5fxfq\" (UID: \"1265856e-7658-44ca-b0a9-a0a5a42b8f5d\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.462513 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.465070 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-4sjph" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.474053 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.500806 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww56n\" (UniqueName: \"kubernetes.io/projected/ebc9c519-e267-43d1-93b7-4cf38c84cc66-kube-api-access-ww56n\") pod \"heat-operator-controller-manager-5d889d78cf-2z84q\" (UID: \"ebc9c519-e267-43d1-93b7-4cf38c84cc66\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.526054 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55r5w\" (UniqueName: \"kubernetes.io/projected/400a027c-2dab-48e5-a109-e7b64d35807a-kube-api-access-55r5w\") pod \"keystone-operator-controller-manager-5bd55b4bff-j8t6g\" (UID: \"400a027c-2dab-48e5-a109-e7b64d35807a\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.526629 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qfkp\" (UniqueName: \"kubernetes.io/projected/863b3375-804f-4c8b-ba14-01230d822604-kube-api-access-9qfkp\") pod \"ironic-operator-controller-manager-5cd4858477-lbq2b\" (UID: \"863b3375-804f-4c8b-ba14-01230d822604\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.533316 4669 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-88c7-fssdp"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.534628 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-fssdp" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.541548 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wzpx4" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.554442 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-fssdp"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.574152 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qfkp\" (UniqueName: \"kubernetes.io/projected/863b3375-804f-4c8b-ba14-01230d822604-kube-api-access-9qfkp\") pod \"ironic-operator-controller-manager-5cd4858477-lbq2b\" (UID: \"863b3375-804f-4c8b-ba14-01230d822604\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.575462 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.576821 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.581526 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-hvlkx" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.584281 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55r5w\" (UniqueName: \"kubernetes.io/projected/400a027c-2dab-48e5-a109-e7b64d35807a-kube-api-access-55r5w\") pod \"keystone-operator-controller-manager-5bd55b4bff-j8t6g\" (UID: \"400a027c-2dab-48e5-a109-e7b64d35807a\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.597807 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.602207 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.614817 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.619564 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.619867 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.620700 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.623616 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.629644 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8gt2n" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.629765 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-w96sd" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.630929 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbbm\" (UniqueName: \"kubernetes.io/projected/08897606-8ccd-4508-bf20-501855920e9e-kube-api-access-hdbbm\") pod \"mariadb-operator-controller-manager-88c7-fssdp\" (UID: \"08897606-8ccd-4508-bf20-501855920e9e\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-fssdp" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.631115 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtxfw\" (UniqueName: \"kubernetes.io/projected/99c2ea9b-bcc7-4933-9614-94c32861e93c-kube-api-access-rtxfw\") pod \"manila-operator-controller-manager-6d68dbc695-djnfk\" (UID: \"99c2ea9b-bcc7-4933-9614-94c32861e93c\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.642056 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.656600 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.664828 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.677597 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.680142 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.686272 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-l6zdr" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.686552 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.699482 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.700643 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.705004 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.723121 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.726931 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.728041 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.731484 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-gdd4b" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.747556 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkhbg\" (UniqueName: \"kubernetes.io/projected/707270cf-007e-4572-bae9-dd6b4c6e50d3-kube-api-access-hkhbg\") pod \"octavia-operator-controller-manager-7b787867f4-nr258\" (UID: \"707270cf-007e-4572-bae9-dd6b4c6e50d3\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.747900 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbbm\" (UniqueName: \"kubernetes.io/projected/08897606-8ccd-4508-bf20-501855920e9e-kube-api-access-hdbbm\") pod \"mariadb-operator-controller-manager-88c7-fssdp\" (UID: \"08897606-8ccd-4508-bf20-501855920e9e\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-fssdp" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.748068 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tqfwq\" (UniqueName: \"kubernetes.io/projected/da724701-02fc-439b-ba86-52bde8cb3003-kube-api-access-tqfwq\") pod \"neutron-operator-controller-manager-849d5b9b84-86p66\" (UID: \"da724701-02fc-439b-ba86-52bde8cb3003\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.748165 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtxfw\" (UniqueName: \"kubernetes.io/projected/99c2ea9b-bcc7-4933-9614-94c32861e93c-kube-api-access-rtxfw\") pod \"manila-operator-controller-manager-6d68dbc695-djnfk\" (UID: \"99c2ea9b-bcc7-4933-9614-94c32861e93c\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.748306 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf456\" (UniqueName: \"kubernetes.io/projected/8783a088-91c6-4f3c-bc34-b3d5a805ea07-kube-api-access-pf456\") pod \"nova-operator-controller-manager-64cd67b5cb-7qf7f\" (UID: \"8783a088-91c6-4f3c-bc34-b3d5a805ea07\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.766513 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.826986 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdbbm\" (UniqueName: \"kubernetes.io/projected/08897606-8ccd-4508-bf20-501855920e9e-kube-api-access-hdbbm\") pod \"mariadb-operator-controller-manager-88c7-fssdp\" (UID: \"08897606-8ccd-4508-bf20-501855920e9e\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-fssdp" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.828233 4669 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.839377 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.840250 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtxfw\" (UniqueName: \"kubernetes.io/projected/99c2ea9b-bcc7-4933-9614-94c32861e93c-kube-api-access-rtxfw\") pod \"manila-operator-controller-manager-6d68dbc695-djnfk\" (UID: \"99c2ea9b-bcc7-4933-9614-94c32861e93c\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.839414 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.845830 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-jzhvb" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.867480 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.868467 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.868888 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqfwq\" (UniqueName: \"kubernetes.io/projected/da724701-02fc-439b-ba86-52bde8cb3003-kube-api-access-tqfwq\") pod \"neutron-operator-controller-manager-849d5b9b84-86p66\" (UID: \"da724701-02fc-439b-ba86-52bde8cb3003\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.868956 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qqwz\" (UniqueName: \"kubernetes.io/projected/621748e9-0765-432f-bbc9-9bb62594eff6-kube-api-access-4qqwz\") pod \"ovn-operator-controller-manager-9976ff44c-5mbbk\" (UID: \"621748e9-0765-432f-bbc9-9bb62594eff6\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.869070 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf456\" (UniqueName: \"kubernetes.io/projected/8783a088-91c6-4f3c-bc34-b3d5a805ea07-kube-api-access-pf456\") pod \"nova-operator-controller-manager-64cd67b5cb-7qf7f\" (UID: \"8783a088-91c6-4f3c-bc34-b3d5a805ea07\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.869207 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thwmv\" (UniqueName: \"kubernetes.io/projected/91df1fb9-8c91-4dde-9317-ff09df368c49-kube-api-access-thwmv\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cfvgks\" (UID: \"91df1fb9-8c91-4dde-9317-ff09df368c49\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" Oct 01 
11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.869261 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkhbg\" (UniqueName: \"kubernetes.io/projected/707270cf-007e-4572-bae9-dd6b4c6e50d3-kube-api-access-hkhbg\") pod \"octavia-operator-controller-manager-7b787867f4-nr258\" (UID: \"707270cf-007e-4572-bae9-dd6b4c6e50d3\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.869382 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91df1fb9-8c91-4dde-9317-ff09df368c49-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cfvgks\" (UID: \"91df1fb9-8c91-4dde-9317-ff09df368c49\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.870719 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.870851 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.875793 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-sgml8" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.877581 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-vxwz5"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.879189 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.886141 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-fssdp" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.891548 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-t995b" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.891769 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-24rgq" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.918371 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhbg\" (UniqueName: \"kubernetes.io/projected/707270cf-007e-4572-bae9-dd6b4c6e50d3-kube-api-access-hkhbg\") pod \"octavia-operator-controller-manager-7b787867f4-nr258\" (UID: \"707270cf-007e-4572-bae9-dd6b4c6e50d3\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.926167 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-vxwz5"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.930678 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqfwq\" (UniqueName: \"kubernetes.io/projected/da724701-02fc-439b-ba86-52bde8cb3003-kube-api-access-tqfwq\") pod \"neutron-operator-controller-manager-849d5b9b84-86p66\" (UID: \"da724701-02fc-439b-ba86-52bde8cb3003\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.944396 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll"] Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.957739 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.972144 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thwmv\" (UniqueName: \"kubernetes.io/projected/91df1fb9-8c91-4dde-9317-ff09df368c49-kube-api-access-thwmv\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cfvgks\" (UID: \"91df1fb9-8c91-4dde-9317-ff09df368c49\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.972262 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj8rs\" (UniqueName: \"kubernetes.io/projected/4f573e37-cb0a-4eba-9477-7c3d71276c86-kube-api-access-sj8rs\") pod \"placement-operator-controller-manager-589c58c6c-bc8gx\" (UID: \"4f573e37-cb0a-4eba-9477-7c3d71276c86\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.972288 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91df1fb9-8c91-4dde-9317-ff09df368c49-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cfvgks\" (UID: \"91df1fb9-8c91-4dde-9317-ff09df368c49\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.972314 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5rwx\" (UniqueName: 
\"kubernetes.io/projected/e24ede8f-da24-4161-8621-d8b5abd08c1f-kube-api-access-b5rwx\") pod \"swift-operator-controller-manager-84d6b4b759-2c2qw\" (UID: \"e24ede8f-da24-4161-8621-d8b5abd08c1f\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.972344 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qqwz\" (UniqueName: \"kubernetes.io/projected/621748e9-0765-432f-bbc9-9bb62594eff6-kube-api-access-4qqwz\") pod \"ovn-operator-controller-manager-9976ff44c-5mbbk\" (UID: \"621748e9-0765-432f-bbc9-9bb62594eff6\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.972376 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8mgv\" (UniqueName: \"kubernetes.io/projected/a2282a94-4700-4aae-8572-2104962decf8-kube-api-access-f8mgv\") pod \"telemetry-operator-controller-manager-b8d54b5d7-p5qll\" (UID: \"a2282a94-4700-4aae-8572-2104962decf8\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.972430 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jszjd\" (UniqueName: \"kubernetes.io/projected/fb18dab5-d638-443a-bb62-6508de79bc0f-kube-api-access-jszjd\") pod \"test-operator-controller-manager-85777745bb-vxwz5\" (UID: \"fb18dab5-d638-443a-bb62-6508de79bc0f\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" Oct 01 11:43:06 crc kubenswrapper[4669]: E1001 11:43:06.972599 4669 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 11:43:06 crc kubenswrapper[4669]: E1001 11:43:06.972682 4669 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91df1fb9-8c91-4dde-9317-ff09df368c49-cert podName:91df1fb9-8c91-4dde-9317-ff09df368c49 nodeName:}" failed. No retries permitted until 2025-10-01 11:43:07.472659177 +0000 UTC m=+878.572224154 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91df1fb9-8c91-4dde-9317-ff09df368c49-cert") pod "openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" (UID: "91df1fb9-8c91-4dde-9317-ff09df368c49") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.979785 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf456\" (UniqueName: \"kubernetes.io/projected/8783a088-91c6-4f3c-bc34-b3d5a805ea07-kube-api-access-pf456\") pod \"nova-operator-controller-manager-64cd67b5cb-7qf7f\" (UID: \"8783a088-91c6-4f3c-bc34-b3d5a805ea07\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f" Oct 01 11:43:06 crc kubenswrapper[4669]: I1001 11:43:06.994386 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:06.999828 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz"] Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.002496 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.009622 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-df9x7" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.018163 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz"] Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.022389 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thwmv\" (UniqueName: \"kubernetes.io/projected/91df1fb9-8c91-4dde-9317-ff09df368c49-kube-api-access-thwmv\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cfvgks\" (UID: \"91df1fb9-8c91-4dde-9317-ff09df368c49\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.035510 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qqwz\" (UniqueName: \"kubernetes.io/projected/621748e9-0765-432f-bbc9-9bb62594eff6-kube-api-access-4qqwz\") pod \"ovn-operator-controller-manager-9976ff44c-5mbbk\" (UID: \"621748e9-0765-432f-bbc9-9bb62594eff6\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.063669 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7"] Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.065427 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.068356 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7h6lc" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.071177 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.075337 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7vcw\" (UniqueName: \"kubernetes.io/projected/6d2b6087-c54d-4138-b162-e024a7a0e842-kube-api-access-h7vcw\") pod \"watcher-operator-controller-manager-6b9957f54f-lgckz\" (UID: \"6d2b6087-c54d-4138-b162-e024a7a0e842\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.075405 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jszjd\" (UniqueName: \"kubernetes.io/projected/fb18dab5-d638-443a-bb62-6508de79bc0f-kube-api-access-jszjd\") pod \"test-operator-controller-manager-85777745bb-vxwz5\" (UID: \"fb18dab5-d638-443a-bb62-6508de79bc0f\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.075472 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj8rs\" (UniqueName: \"kubernetes.io/projected/4f573e37-cb0a-4eba-9477-7c3d71276c86-kube-api-access-sj8rs\") pod \"placement-operator-controller-manager-589c58c6c-bc8gx\" (UID: \"4f573e37-cb0a-4eba-9477-7c3d71276c86\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.075512 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b5rwx\" (UniqueName: \"kubernetes.io/projected/e24ede8f-da24-4161-8621-d8b5abd08c1f-kube-api-access-b5rwx\") pod \"swift-operator-controller-manager-84d6b4b759-2c2qw\" (UID: \"e24ede8f-da24-4161-8621-d8b5abd08c1f\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.075555 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8mgv\" (UniqueName: \"kubernetes.io/projected/a2282a94-4700-4aae-8572-2104962decf8-kube-api-access-f8mgv\") pod \"telemetry-operator-controller-manager-b8d54b5d7-p5qll\" (UID: \"a2282a94-4700-4aae-8572-2104962decf8\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.077983 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7"] Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.079779 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.093504 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.109771 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jszjd\" (UniqueName: \"kubernetes.io/projected/fb18dab5-d638-443a-bb62-6508de79bc0f-kube-api-access-jszjd\") pod \"test-operator-controller-manager-85777745bb-vxwz5\" (UID: \"fb18dab5-d638-443a-bb62-6508de79bc0f\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.109866 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-szrsc"] Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.110942 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-szrsc" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.111945 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8mgv\" (UniqueName: \"kubernetes.io/projected/a2282a94-4700-4aae-8572-2104962decf8-kube-api-access-f8mgv\") pod \"telemetry-operator-controller-manager-b8d54b5d7-p5qll\" (UID: \"a2282a94-4700-4aae-8572-2104962decf8\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.121644 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-szrsc"] Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.133189 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-klmmq" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.141898 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5rwx\" (UniqueName: 
\"kubernetes.io/projected/e24ede8f-da24-4161-8621-d8b5abd08c1f-kube-api-access-b5rwx\") pod \"swift-operator-controller-manager-84d6b4b759-2c2qw\" (UID: \"e24ede8f-da24-4161-8621-d8b5abd08c1f\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.144406 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj8rs\" (UniqueName: \"kubernetes.io/projected/4f573e37-cb0a-4eba-9477-7c3d71276c86-kube-api-access-sj8rs\") pod \"placement-operator-controller-manager-589c58c6c-bc8gx\" (UID: \"4f573e37-cb0a-4eba-9477-7c3d71276c86\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.150145 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp"] Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.160155 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq"] Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.178192 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7vcw\" (UniqueName: \"kubernetes.io/projected/6d2b6087-c54d-4138-b162-e024a7a0e842-kube-api-access-h7vcw\") pod \"watcher-operator-controller-manager-6b9957f54f-lgckz\" (UID: \"6d2b6087-c54d-4138-b162-e024a7a0e842\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.178240 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8cf8\" (UniqueName: \"kubernetes.io/projected/964d3ab1-839a-49e6-b7c8-46056b070131-kube-api-access-w8cf8\") pod \"openstack-operator-controller-manager-6599487588-n9gx7\" (UID: \"964d3ab1-839a-49e6-b7c8-46056b070131\") " 
pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.178292 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5jpb\" (UniqueName: \"kubernetes.io/projected/2545705c-a102-47ca-b42b-119670c5be57-kube-api-access-p5jpb\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-szrsc\" (UID: \"2545705c-a102-47ca-b42b-119670c5be57\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-szrsc" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.178336 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/964d3ab1-839a-49e6-b7c8-46056b070131-cert\") pod \"openstack-operator-controller-manager-6599487588-n9gx7\" (UID: \"964d3ab1-839a-49e6-b7c8-46056b070131\") " pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.211684 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.228707 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.263827 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.276259 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7vcw\" (UniqueName: \"kubernetes.io/projected/6d2b6087-c54d-4138-b162-e024a7a0e842-kube-api-access-h7vcw\") pod \"watcher-operator-controller-manager-6b9957f54f-lgckz\" (UID: \"6d2b6087-c54d-4138-b162-e024a7a0e842\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.289695 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/964d3ab1-839a-49e6-b7c8-46056b070131-cert\") pod \"openstack-operator-controller-manager-6599487588-n9gx7\" (UID: \"964d3ab1-839a-49e6-b7c8-46056b070131\") " pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.289926 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8cf8\" (UniqueName: \"kubernetes.io/projected/964d3ab1-839a-49e6-b7c8-46056b070131-kube-api-access-w8cf8\") pod \"openstack-operator-controller-manager-6599487588-n9gx7\" (UID: \"964d3ab1-839a-49e6-b7c8-46056b070131\") " pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.290065 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jpb\" (UniqueName: \"kubernetes.io/projected/2545705c-a102-47ca-b42b-119670c5be57-kube-api-access-p5jpb\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-szrsc\" (UID: \"2545705c-a102-47ca-b42b-119670c5be57\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-szrsc" Oct 01 11:43:07 crc kubenswrapper[4669]: E1001 11:43:07.291001 4669 secret.go:188] Couldn't 
get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 01 11:43:07 crc kubenswrapper[4669]: E1001 11:43:07.291096 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/964d3ab1-839a-49e6-b7c8-46056b070131-cert podName:964d3ab1-839a-49e6-b7c8-46056b070131 nodeName:}" failed. No retries permitted until 2025-10-01 11:43:07.791058239 +0000 UTC m=+878.890623216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/964d3ab1-839a-49e6-b7c8-46056b070131-cert") pod "openstack-operator-controller-manager-6599487588-n9gx7" (UID: "964d3ab1-839a-49e6-b7c8-46056b070131") : secret "webhook-server-cert" not found Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.320743 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8cf8\" (UniqueName: \"kubernetes.io/projected/964d3ab1-839a-49e6-b7c8-46056b070131-kube-api-access-w8cf8\") pod \"openstack-operator-controller-manager-6599487588-n9gx7\" (UID: \"964d3ab1-839a-49e6-b7c8-46056b070131\") " pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.321037 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5jpb\" (UniqueName: \"kubernetes.io/projected/2545705c-a102-47ca-b42b-119670c5be57-kube-api-access-p5jpb\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-szrsc\" (UID: \"2545705c-a102-47ca-b42b-119670c5be57\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-szrsc" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.335138 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.342477 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.396549 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.474364 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-szrsc" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.494577 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91df1fb9-8c91-4dde-9317-ff09df368c49-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cfvgks\" (UID: \"91df1fb9-8c91-4dde-9317-ff09df368c49\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.502610 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91df1fb9-8c91-4dde-9317-ff09df368c49-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cfvgks\" (UID: \"91df1fb9-8c91-4dde-9317-ff09df368c49\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.627611 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.727595 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq" event={"ID":"aba4ff11-8110-4490-8a20-74c454be55d8","Type":"ContainerStarted","Data":"d864a4a4ad8e60df6bcd51ce6ff0a412c983b6269fbd3c5e08004265913a1b51"} Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.750356 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp" event={"ID":"7f0d56cb-1002-4345-903e-7e5979f47978","Type":"ContainerStarted","Data":"4534cf0fa1f9570adb301c165395b62acf3a38d088b565ae9491ab8150eb5995"} Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.758579 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57"] Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.808927 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/964d3ab1-839a-49e6-b7c8-46056b070131-cert\") pod \"openstack-operator-controller-manager-6599487588-n9gx7\" (UID: \"964d3ab1-839a-49e6-b7c8-46056b070131\") " pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.826719 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/964d3ab1-839a-49e6-b7c8-46056b070131-cert\") pod \"openstack-operator-controller-manager-6599487588-n9gx7\" (UID: \"964d3ab1-839a-49e6-b7c8-46056b070131\") " pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" Oct 01 11:43:07 crc kubenswrapper[4669]: I1001 11:43:07.849781 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6"] Oct 01 11:43:07 crc kubenswrapper[4669]: W1001 11:43:07.875634 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8163ded_d297_43ea_bde7_b5b90bdf1d17.slice/crio-5957c1871e4cef6d3e8f9192add0ad7d29aed47b01d2b1ea1616db9a0aff9077 WatchSource:0}: Error finding container 5957c1871e4cef6d3e8f9192add0ad7d29aed47b01d2b1ea1616db9a0aff9077: Status 404 returned error can't find the container with id 5957c1871e4cef6d3e8f9192add0ad7d29aed47b01d2b1ea1616db9a0aff9077 Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.056875 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.449153 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.453202 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.461576 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.479310 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.486134 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.516181 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.522850 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw"] Oct 01 11:43:08 crc kubenswrapper[4669]: W1001 11:43:08.543294 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24ede8f_da24_4161_8621_d8b5abd08c1f.slice/crio-1d06304d5d102c8c0611de651dbf7ed5a7885ce896209607cedf82b8059297b8 WatchSource:0}: Error finding container 1d06304d5d102c8c0611de651dbf7ed5a7885ce896209607cedf82b8059297b8: Status 404 returned error can't find the container with id 1d06304d5d102c8c0611de651dbf7ed5a7885ce896209607cedf82b8059297b8 Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.773617 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.779016 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.780999 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx" event={"ID":"4f573e37-cb0a-4eba-9477-7c3d71276c86","Type":"ContainerStarted","Data":"d692ebbbc219878eab3a2e97143097c3a7a85c28f8f55e3e6c001558fa781939"} Oct 01 11:43:08 crc kubenswrapper[4669]: W1001 11:43:08.783356 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod707270cf_007e_4572_bae9_dd6b4c6e50d3.slice/crio-2c9435d9ab458f38669246a2f2730bf5fdfa71285a8194c581aed8f7c23ed94e WatchSource:0}: Error finding container 2c9435d9ab458f38669246a2f2730bf5fdfa71285a8194c581aed8f7c23ed94e: Status 404 returned 
error can't find the container with id 2c9435d9ab458f38669246a2f2730bf5fdfa71285a8194c581aed8f7c23ed94e Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.783774 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57" event={"ID":"4fa32a0a-904e-4b37-8ffb-a8c1d89df689","Type":"ContainerStarted","Data":"10edf3856ea989c15063dfa1c3107b47fe1221dbef8d41fe133fd3fae15b4b97"} Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.786543 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-fssdp"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.786733 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq" event={"ID":"1265856e-7658-44ca-b0a9-a0a5a42b8f5d","Type":"ContainerStarted","Data":"1a5408a652bda31d2cd0043b80094ad07b4c63da22ceac31ca33c5ef80fc98cc"} Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.789740 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw" event={"ID":"e24ede8f-da24-4161-8621-d8b5abd08c1f","Type":"ContainerStarted","Data":"1d06304d5d102c8c0611de651dbf7ed5a7885ce896209607cedf82b8059297b8"} Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.791418 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6" event={"ID":"e8163ded-d297-43ea-bde7-b5b90bdf1d17","Type":"ContainerStarted","Data":"5957c1871e4cef6d3e8f9192add0ad7d29aed47b01d2b1ea1616db9a0aff9077"} Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.792463 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.793316 4669 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f" event={"ID":"8783a088-91c6-4f3c-bc34-b3d5a805ea07","Type":"ContainerStarted","Data":"a63314e654651ac308bf2a7b948339e2865424efe3f90442ec62c09e2849cd49"} Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.795393 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q" event={"ID":"ebc9c519-e267-43d1-93b7-4cf38c84cc66","Type":"ContainerStarted","Data":"e2d9bc18c79e6004d68e99dee275c42f70d53f54bfaabd7e13ffb9d24b166194"} Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.796950 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk" event={"ID":"99c2ea9b-bcc7-4933-9614-94c32861e93c","Type":"ContainerStarted","Data":"c8ff1213ca9ed3b637e2a467a404beab0d021b965f4e49d17fb171633e73c26e"} Oct 01 11:43:08 crc kubenswrapper[4669]: W1001 11:43:08.797607 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08897606_8ccd_4508_bf20_501855920e9e.slice/crio-557705755de1ca491be846a334cb4e6e04d4833b8725464eacdb1b5ea4455c3d WatchSource:0}: Error finding container 557705755de1ca491be846a334cb4e6e04d4833b8725464eacdb1b5ea4455c3d: Status 404 returned error can't find the container with id 557705755de1ca491be846a334cb4e6e04d4833b8725464eacdb1b5ea4455c3d Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.798755 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b"] Oct 01 11:43:08 crc kubenswrapper[4669]: W1001 11:43:08.799417 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400a027c_2dab_48e5_a109_e7b64d35807a.slice/crio-1256e66bd9820938e337ab879afa2abcc734bf66592b5aa9a357c7abeb1c27a5 WatchSource:0}: Error 
finding container 1256e66bd9820938e337ab879afa2abcc734bf66592b5aa9a357c7abeb1c27a5: Status 404 returned error can't find the container with id 1256e66bd9820938e337ab879afa2abcc734bf66592b5aa9a357c7abeb1c27a5 Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.806949 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66" event={"ID":"da724701-02fc-439b-ba86-52bde8cb3003","Type":"ContainerStarted","Data":"be73a04dc74c203e6d22527e1264efbda7ab662d88948fb7262767325e2f9d9a"} Oct 01 11:43:08 crc kubenswrapper[4669]: W1001 11:43:08.807099 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod621748e9_0765_432f_bbc9_9bb62594eff6.slice/crio-51a2c2e1878b3540228aaf3764b856c0cb988ffa65da7342792695d6d10c257b WatchSource:0}: Error finding container 51a2c2e1878b3540228aaf3764b856c0cb988ffa65da7342792695d6d10c257b: Status 404 returned error can't find the container with id 51a2c2e1878b3540228aaf3764b856c0cb988ffa65da7342792695d6d10c257b Oct 01 11:43:08 crc kubenswrapper[4669]: E1001 11:43:08.817509 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4qqwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-5mbbk_openstack-operators(621748e9-0765-432f-bbc9-9bb62594eff6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.877824 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-szrsc"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.891049 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-85777745bb-vxwz5"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.900987 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7"] Oct 01 11:43:08 crc kubenswrapper[4669]: W1001 11:43:08.902955 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2545705c_a102_47ca_b42b_119670c5be57.slice/crio-ad983857d35fe54f7e7852c1f5cd12c6b09cf660663c5ace256f305380ef6471 WatchSource:0}: Error finding container ad983857d35fe54f7e7852c1f5cd12c6b09cf660663c5ace256f305380ef6471: Status 404 returned error can't find the container with id ad983857d35fe54f7e7852c1f5cd12c6b09cf660663c5ace256f305380ef6471 Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.906637 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.912522 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz"] Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.923790 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll"] Oct 01 11:43:08 crc kubenswrapper[4669]: E1001 11:43:08.928222 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h7vcw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-operator-controller-manager-6b9957f54f-lgckz_openstack-operators(6d2b6087-c54d-4138-b162-e024a7a0e842): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 11:43:08 crc kubenswrapper[4669]: I1001 11:43:08.943610 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2"] Oct 01 11:43:08 crc kubenswrapper[4669]: W1001 11:43:08.951920 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod964d3ab1_839a_49e6_b7c8_46056b070131.slice/crio-46c94a3aed2e291cf1c91499b4cdb15b80d3016acec866d4d23ba2a6fe9d98b0 WatchSource:0}: Error finding container 46c94a3aed2e291cf1c91499b4cdb15b80d3016acec866d4d23ba2a6fe9d98b0: Status 404 returned error can't find the container with id 46c94a3aed2e291cf1c91499b4cdb15b80d3016acec866d4d23ba2a6fe9d98b0 Oct 01 11:43:08 crc kubenswrapper[4669]: E1001 11:43:08.956994 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jszjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-85777745bb-vxwz5_openstack-operators(fb18dab5-d638-443a-bb62-6508de79bc0f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 11:43:08 crc kubenswrapper[4669]: W1001 11:43:08.960897 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2282a94_4700_4aae_8572_2104962decf8.slice/crio-f385e4f98ee0864d7adb5ba2c0c68c8852baf16c81483834d8c220d2a05ced87 WatchSource:0}: Error finding container 
f385e4f98ee0864d7adb5ba2c0c68c8852baf16c81483834d8c220d2a05ced87: Status 404 returned error can't find the container with id f385e4f98ee0864d7adb5ba2c0c68c8852baf16c81483834d8c220d2a05ced87 Oct 01 11:43:08 crc kubenswrapper[4669]: W1001 11:43:08.971124 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91df1fb9_8c91_4dde_9317_ff09df368c49.slice/crio-cafe0bf1700437f3bf4d0e6d7e5be79318980e0830c27602c46e6bfef05cc922 WatchSource:0}: Error finding container cafe0bf1700437f3bf4d0e6d7e5be79318980e0830c27602c46e6bfef05cc922: Status 404 returned error can't find the container with id cafe0bf1700437f3bf4d0e6d7e5be79318980e0830c27602c46e6bfef05cc922 Oct 01 11:43:08 crc kubenswrapper[4669]: E1001 11:43:08.971832 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f8mgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-b8d54b5d7-p5qll_openstack-operators(a2282a94-4700-4aae-8572-2104962decf8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 11:43:08 crc kubenswrapper[4669]: E1001 11:43:08.972506 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: 
{{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tlllt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-9d6c5db85-dv8s2_openstack-operators(a887d629-1025-4da7-8c68-4b17c7205479): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 11:43:09 crc kubenswrapper[4669]: E1001 11:43:09.003753 
4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RE
LATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:cu
rrent-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_
IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,V
alue:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_
OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELAT
ED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-thwmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-77b9676b8cfvgks_openstack-operators(91df1fb9-8c91-4dde-9317-ff09df368c49): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 11:43:09 crc kubenswrapper[4669]: E1001 11:43:09.017824 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" podUID="621748e9-0765-432f-bbc9-9bb62594eff6" Oct 01 11:43:09 crc kubenswrapper[4669]: E1001 11:43:09.167834 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" podUID="6d2b6087-c54d-4138-b162-e024a7a0e842" Oct 01 11:43:09 crc kubenswrapper[4669]: E1001 11:43:09.344614 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" podUID="fb18dab5-d638-443a-bb62-6508de79bc0f" Oct 01 
11:43:09 crc kubenswrapper[4669]: E1001 11:43:09.509507 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" podUID="a887d629-1025-4da7-8c68-4b17c7205479" Oct 01 11:43:09 crc kubenswrapper[4669]: E1001 11:43:09.512822 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" podUID="a2282a94-4700-4aae-8572-2104962decf8" Oct 01 11:43:09 crc kubenswrapper[4669]: E1001 11:43:09.549097 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" podUID="91df1fb9-8c91-4dde-9317-ff09df368c49" Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.846599 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258" event={"ID":"707270cf-007e-4572-bae9-dd6b4c6e50d3","Type":"ContainerStarted","Data":"2c9435d9ab458f38669246a2f2730bf5fdfa71285a8194c581aed8f7c23ed94e"} Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.848829 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-fssdp" event={"ID":"08897606-8ccd-4508-bf20-501855920e9e","Type":"ContainerStarted","Data":"557705755de1ca491be846a334cb4e6e04d4833b8725464eacdb1b5ea4455c3d"} Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.873256 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" 
event={"ID":"91df1fb9-8c91-4dde-9317-ff09df368c49","Type":"ContainerStarted","Data":"c7711a6429961bcf5d6387824dbc79cdf076d31521ec020996d818b9f38b5ada"} Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.873329 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" event={"ID":"91df1fb9-8c91-4dde-9317-ff09df368c49","Type":"ContainerStarted","Data":"cafe0bf1700437f3bf4d0e6d7e5be79318980e0830c27602c46e6bfef05cc922"} Oct 01 11:43:09 crc kubenswrapper[4669]: E1001 11:43:09.878480 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" podUID="91df1fb9-8c91-4dde-9317-ff09df368c49" Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.895035 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" event={"ID":"964d3ab1-839a-49e6-b7c8-46056b070131","Type":"ContainerStarted","Data":"c0a15438da950c0c50854d0b95d4ba2aa5bbe711015445023b9add63e47bbec5"} Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.895129 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" event={"ID":"964d3ab1-839a-49e6-b7c8-46056b070131","Type":"ContainerStarted","Data":"d720143b0634cd5833a1860ea8950430aa02ddaa8e5758240377cef97e538ec3"} Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.895140 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" 
event={"ID":"964d3ab1-839a-49e6-b7c8-46056b070131","Type":"ContainerStarted","Data":"46c94a3aed2e291cf1c91499b4cdb15b80d3016acec866d4d23ba2a6fe9d98b0"} Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.895229 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.923102 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-szrsc" event={"ID":"2545705c-a102-47ca-b42b-119670c5be57","Type":"ContainerStarted","Data":"ad983857d35fe54f7e7852c1f5cd12c6b09cf660663c5ace256f305380ef6471"} Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.980503 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" event={"ID":"fb18dab5-d638-443a-bb62-6508de79bc0f","Type":"ContainerStarted","Data":"2ab95b1853ec46585d020046e1e71e58db34b34de12760ab729c865d7fdf298a"} Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.981208 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" event={"ID":"fb18dab5-d638-443a-bb62-6508de79bc0f","Type":"ContainerStarted","Data":"d138408ca3b582951fb5da89529230462dae80777748690dc390615370ddafee"} Oct 01 11:43:09 crc kubenswrapper[4669]: E1001 11:43:09.990371 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" podUID="fb18dab5-d638-443a-bb62-6508de79bc0f" Oct 01 11:43:09 crc kubenswrapper[4669]: I1001 11:43:09.997436 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b" event={"ID":"863b3375-804f-4c8b-ba14-01230d822604","Type":"ContainerStarted","Data":"2f5e55ea49f54bd8777daf688b06603b283d0668e33568aa1cd6f6dfa0fdc706"} Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.011470 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" event={"ID":"6d2b6087-c54d-4138-b162-e024a7a0e842","Type":"ContainerStarted","Data":"5bc8ef4839cbd9f403a44141df3a842bd147309356b6ddb257130640dbeb430e"} Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.012570 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" event={"ID":"6d2b6087-c54d-4138-b162-e024a7a0e842","Type":"ContainerStarted","Data":"18dd55fe5ef1802575eabf26ad62ad6d5a9c13f87b608ee1700c70aced1149ca"} Oct 01 11:43:10 crc kubenswrapper[4669]: E1001 11:43:10.017367 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" podUID="6d2b6087-c54d-4138-b162-e024a7a0e842" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.025872 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" event={"ID":"a2282a94-4700-4aae-8572-2104962decf8","Type":"ContainerStarted","Data":"f12974663c38aaba37c6c8f1cd96b8567899fc2ae081ebf26e5eb679616e63fd"} Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.025924 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" 
event={"ID":"a2282a94-4700-4aae-8572-2104962decf8","Type":"ContainerStarted","Data":"f385e4f98ee0864d7adb5ba2c0c68c8852baf16c81483834d8c220d2a05ced87"} Oct 01 11:43:10 crc kubenswrapper[4669]: E1001 11:43:10.030393 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" podUID="a2282a94-4700-4aae-8572-2104962decf8" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.041147 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g" event={"ID":"400a027c-2dab-48e5-a109-e7b64d35807a","Type":"ContainerStarted","Data":"1256e66bd9820938e337ab879afa2abcc734bf66592b5aa9a357c7abeb1c27a5"} Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.048099 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" event={"ID":"621748e9-0765-432f-bbc9-9bb62594eff6","Type":"ContainerStarted","Data":"df5602056341b0b03b596663ca9ae9ab19a95cc1f5b2c7c26cd4f936e3d70efe"} Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.048145 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" event={"ID":"621748e9-0765-432f-bbc9-9bb62594eff6","Type":"ContainerStarted","Data":"51a2c2e1878b3540228aaf3764b856c0cb988ffa65da7342792695d6d10c257b"} Oct 01 11:43:10 crc kubenswrapper[4669]: E1001 11:43:10.050028 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" podUID="621748e9-0765-432f-bbc9-9bb62594eff6" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.055626 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" event={"ID":"a887d629-1025-4da7-8c68-4b17c7205479","Type":"ContainerStarted","Data":"5c7c3986cc60ce47c0bbdaa4cedbed72f2358d0d8cb76cc9512bc88c021e603d"} Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.055690 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" event={"ID":"a887d629-1025-4da7-8c68-4b17c7205479","Type":"ContainerStarted","Data":"f1f60c5d6c45be1bcd0e27ef731092cbd78877ca294cfa403841018149304439"} Oct 01 11:43:10 crc kubenswrapper[4669]: E1001 11:43:10.105725 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" podUID="a887d629-1025-4da7-8c68-4b17c7205479" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.347584 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rdh8p"] Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.349947 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.412953 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rdh8p"] Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.485355 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7h8j\" (UniqueName: \"kubernetes.io/projected/214eba80-e62b-41f4-8100-e63e9d8be3dd-kube-api-access-p7h8j\") pod \"certified-operators-rdh8p\" (UID: \"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.485436 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-catalog-content\") pod \"certified-operators-rdh8p\" (UID: \"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.485468 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-utilities\") pod \"certified-operators-rdh8p\" (UID: \"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.546488 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" podStartSLOduration=4.546458917 podStartE2EDuration="4.546458917s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:43:10.539604878 +0000 
UTC m=+881.639169855" watchObservedRunningTime="2025-10-01 11:43:10.546458917 +0000 UTC m=+881.646023894" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.589009 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-utilities\") pod \"certified-operators-rdh8p\" (UID: \"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.589336 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7h8j\" (UniqueName: \"kubernetes.io/projected/214eba80-e62b-41f4-8100-e63e9d8be3dd-kube-api-access-p7h8j\") pod \"certified-operators-rdh8p\" (UID: \"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.589380 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-catalog-content\") pod \"certified-operators-rdh8p\" (UID: \"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.589925 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-catalog-content\") pod \"certified-operators-rdh8p\" (UID: \"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.590549 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-utilities\") pod \"certified-operators-rdh8p\" (UID: 
\"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.610161 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7h8j\" (UniqueName: \"kubernetes.io/projected/214eba80-e62b-41f4-8100-e63e9d8be3dd-kube-api-access-p7h8j\") pod \"certified-operators-rdh8p\" (UID: \"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:10 crc kubenswrapper[4669]: I1001 11:43:10.702183 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:11 crc kubenswrapper[4669]: E1001 11:43:11.082661 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" podUID="fb18dab5-d638-443a-bb62-6508de79bc0f" Oct 01 11:43:11 crc kubenswrapper[4669]: E1001 11:43:11.083095 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" podUID="a887d629-1025-4da7-8c68-4b17c7205479" Oct 01 11:43:11 crc kubenswrapper[4669]: E1001 11:43:11.089460 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" podUID="6d2b6087-c54d-4138-b162-e024a7a0e842" Oct 01 11:43:11 crc kubenswrapper[4669]: E1001 11:43:11.089486 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" podUID="a2282a94-4700-4aae-8572-2104962decf8" Oct 01 11:43:11 crc kubenswrapper[4669]: E1001 11:43:11.089854 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" podUID="91df1fb9-8c91-4dde-9317-ff09df368c49" Oct 01 11:43:11 crc kubenswrapper[4669]: E1001 11:43:11.090022 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" podUID="621748e9-0765-432f-bbc9-9bb62594eff6" Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.517500 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rfs5s"] Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.520629 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.529492 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rfs5s"] Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.612405 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-catalog-content\") pod \"community-operators-rfs5s\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.612462 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27bww\" (UniqueName: \"kubernetes.io/projected/fa2b0226-7dae-4379-b54f-3650d7208784-kube-api-access-27bww\") pod \"community-operators-rfs5s\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.612479 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-utilities\") pod \"community-operators-rfs5s\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.714102 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-catalog-content\") pod \"community-operators-rfs5s\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.714155 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-27bww\" (UniqueName: \"kubernetes.io/projected/fa2b0226-7dae-4379-b54f-3650d7208784-kube-api-access-27bww\") pod \"community-operators-rfs5s\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.714188 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-utilities\") pod \"community-operators-rfs5s\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.715489 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-utilities\") pod \"community-operators-rfs5s\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.715953 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-catalog-content\") pod \"community-operators-rfs5s\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.742134 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27bww\" (UniqueName: \"kubernetes.io/projected/fa2b0226-7dae-4379-b54f-3650d7208784-kube-api-access-27bww\") pod \"community-operators-rfs5s\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:11 crc kubenswrapper[4669]: I1001 11:43:11.899165 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.132232 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8vc2c"] Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.134816 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.165267 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8vc2c"] Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.186241 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-utilities\") pod \"redhat-operators-8vc2c\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.186331 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-catalog-content\") pod \"redhat-operators-8vc2c\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.186406 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5vjw\" (UniqueName: \"kubernetes.io/projected/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-kube-api-access-z5vjw\") pod \"redhat-operators-8vc2c\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.289949 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z5vjw\" (UniqueName: \"kubernetes.io/projected/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-kube-api-access-z5vjw\") pod \"redhat-operators-8vc2c\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.290159 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-utilities\") pod \"redhat-operators-8vc2c\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.290209 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-catalog-content\") pod \"redhat-operators-8vc2c\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.290857 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-catalog-content\") pod \"redhat-operators-8vc2c\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.291048 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-utilities\") pod \"redhat-operators-8vc2c\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.315751 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5vjw\" (UniqueName: 
\"kubernetes.io/projected/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-kube-api-access-z5vjw\") pod \"redhat-operators-8vc2c\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:15 crc kubenswrapper[4669]: I1001 11:43:15.483881 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:18 crc kubenswrapper[4669]: I1001 11:43:18.065479 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6599487588-n9gx7" Oct 01 11:43:21 crc kubenswrapper[4669]: I1001 11:43:21.915909 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8vc2c"] Oct 01 11:43:22 crc kubenswrapper[4669]: W1001 11:43:22.046017 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod049fd2a0_3cfd_4c63_a2e3_5dde72ebd969.slice/crio-de99bf54e60feb896dfe7fa14a2c5414b16db4f02ad30e51597623ce81f68217 WatchSource:0}: Error finding container de99bf54e60feb896dfe7fa14a2c5414b16db4f02ad30e51597623ce81f68217: Status 404 returned error can't find the container with id de99bf54e60feb896dfe7fa14a2c5414b16db4f02ad30e51597623ce81f68217 Oct 01 11:43:22 crc kubenswrapper[4669]: I1001 11:43:22.244392 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vc2c" event={"ID":"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969","Type":"ContainerStarted","Data":"de99bf54e60feb896dfe7fa14a2c5414b16db4f02ad30e51597623ce81f68217"} Oct 01 11:43:22 crc kubenswrapper[4669]: I1001 11:43:22.257854 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp" 
event={"ID":"7f0d56cb-1002-4345-903e-7e5979f47978","Type":"ContainerStarted","Data":"1acc311828ccdaf7d3c1e68545c5330cdb7547965b47a463f393b3d7d65d82c2"} Oct 01 11:43:22 crc kubenswrapper[4669]: I1001 11:43:22.262823 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6" event={"ID":"e8163ded-d297-43ea-bde7-b5b90bdf1d17","Type":"ContainerStarted","Data":"2957280cc9590da8a172c2b8845936b45b413fa20202b6e2811d8daa64037a86"} Oct 01 11:43:22 crc kubenswrapper[4669]: I1001 11:43:22.268445 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq" event={"ID":"aba4ff11-8110-4490-8a20-74c454be55d8","Type":"ContainerStarted","Data":"677825236ff8bf1d543f62956e2935f3235ab35d8f149b6a526c8119c1b03c7e"} Oct 01 11:43:22 crc kubenswrapper[4669]: I1001 11:43:22.279218 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rdh8p"] Oct 01 11:43:22 crc kubenswrapper[4669]: I1001 11:43:22.281729 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258" event={"ID":"707270cf-007e-4572-bae9-dd6b4c6e50d3","Type":"ContainerStarted","Data":"33c7fb69e4c280ef121d7b9a6c3baf6f6ff64de30757df1e916436f913ff919e"} Oct 01 11:43:22 crc kubenswrapper[4669]: I1001 11:43:22.331342 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rfs5s"] Oct 01 11:43:22 crc kubenswrapper[4669]: W1001 11:43:22.368129 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod214eba80_e62b_41f4_8100_e63e9d8be3dd.slice/crio-ed751432bd287b204dbfc7ec68e78fe679db13b662d26db40d79061dfc40da37 WatchSource:0}: Error finding container ed751432bd287b204dbfc7ec68e78fe679db13b662d26db40d79061dfc40da37: Status 404 returned error can't find 
the container with id ed751432bd287b204dbfc7ec68e78fe679db13b662d26db40d79061dfc40da37 Oct 01 11:43:22 crc kubenswrapper[4669]: W1001 11:43:22.398263 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa2b0226_7dae_4379_b54f_3650d7208784.slice/crio-060a993444e4dc0ae7965b021374a7c6ab69d9205dcf27ffb316db3db8d3577c WatchSource:0}: Error finding container 060a993444e4dc0ae7965b021374a7c6ab69d9205dcf27ffb316db3db8d3577c: Status 404 returned error can't find the container with id 060a993444e4dc0ae7965b021374a7c6ab69d9205dcf27ffb316db3db8d3577c Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.343751 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-fssdp" event={"ID":"08897606-8ccd-4508-bf20-501855920e9e","Type":"ContainerStarted","Data":"a928f30a48e355f881053f1723285a8a440d3adfb28a7c479b333052d1493756"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.353801 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g" event={"ID":"400a027c-2dab-48e5-a109-e7b64d35807a","Type":"ContainerStarted","Data":"974da181a6f27fa39b8116bc4f036f2634d7e4da6eb902b9eb105e2827fe119b"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.361621 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw" event={"ID":"e24ede8f-da24-4161-8621-d8b5abd08c1f","Type":"ContainerStarted","Data":"a486f8cae86cda014914e74d6b0eb02aec719cf9bc86afec2f26596a395c4fe8"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.368472 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f" 
event={"ID":"8783a088-91c6-4f3c-bc34-b3d5a805ea07","Type":"ContainerStarted","Data":"97e49737eb19e4cd6d7c433f5bba52f0f5eaff42c2a31f284c77de92c59d86ec"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.378531 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-szrsc" event={"ID":"2545705c-a102-47ca-b42b-119670c5be57","Type":"ContainerStarted","Data":"073edf84a2e5cbc330a5c62f10b4718b470d999251bbed1bb51140394018f318"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.397431 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b" event={"ID":"863b3375-804f-4c8b-ba14-01230d822604","Type":"ContainerStarted","Data":"abf57f44eb3841911bc22291d6013c6d4cb2826bc2524cb5910a6f76c2578e0b"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.410711 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-szrsc" podStartSLOduration=4.575563278 podStartE2EDuration="17.410688852s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.905901566 +0000 UTC m=+880.005466543" lastFinishedPulling="2025-10-01 11:43:21.74102715 +0000 UTC m=+892.840592117" observedRunningTime="2025-10-01 11:43:23.40615003 +0000 UTC m=+894.505715007" watchObservedRunningTime="2025-10-01 11:43:23.410688852 +0000 UTC m=+894.510253829" Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.436195 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57" event={"ID":"4fa32a0a-904e-4b37-8ffb-a8c1d89df689","Type":"ContainerStarted","Data":"a59032aa1d48c85190544258c5509d32de43d6b087f6421c334c5cd1b0b42a6f"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.442721 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-rfs5s" event={"ID":"fa2b0226-7dae-4379-b54f-3650d7208784","Type":"ContainerStarted","Data":"060a993444e4dc0ae7965b021374a7c6ab69d9205dcf27ffb316db3db8d3577c"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.457273 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66" event={"ID":"da724701-02fc-439b-ba86-52bde8cb3003","Type":"ContainerStarted","Data":"bbb82728974a2ddc5ee96c24b631910d47ec28a9cd97dea403d95bb88c4cd7c3"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.467216 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk" event={"ID":"99c2ea9b-bcc7-4933-9614-94c32861e93c","Type":"ContainerStarted","Data":"48f9f2bbc494acb7841b62786a0b4ad00e70bb56c71ab00998854c91a4124bb6"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.479234 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdh8p" event={"ID":"214eba80-e62b-41f4-8100-e63e9d8be3dd","Type":"ContainerStarted","Data":"ed751432bd287b204dbfc7ec68e78fe679db13b662d26db40d79061dfc40da37"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.494223 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq" event={"ID":"1265856e-7658-44ca-b0a9-a0a5a42b8f5d","Type":"ContainerStarted","Data":"c5e112d911738dfe7f36d5ffd0ef999558ba93d7b78ab7aa4cfa9e1eb1593adf"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.500437 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q" event={"ID":"ebc9c519-e267-43d1-93b7-4cf38c84cc66","Type":"ContainerStarted","Data":"ed9616bf67d57b051df6f4ffba5ceb40ea3ae26a87ee0c13137c33856956b106"} Oct 01 11:43:23 crc kubenswrapper[4669]: I1001 11:43:23.504782 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx" event={"ID":"4f573e37-cb0a-4eba-9477-7c3d71276c86","Type":"ContainerStarted","Data":"aac027727097d6dbdba4c0e113e32834c07fe03f7c3a3cadab9d479aa8916482"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.517740 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66" event={"ID":"da724701-02fc-439b-ba86-52bde8cb3003","Type":"ContainerStarted","Data":"f8ec8b466fda76faf01aeb410c1e81f4edaf4badd0afdc1e1a8d4111bdeb5a0a"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.518149 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.522413 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq" event={"ID":"aba4ff11-8110-4490-8a20-74c454be55d8","Type":"ContainerStarted","Data":"b6189d6374bbe3ed485f48a5147601acb8d782d20e6b11ddd48c54e303b349d2"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.522586 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.533273 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g" event={"ID":"400a027c-2dab-48e5-a109-e7b64d35807a","Type":"ContainerStarted","Data":"a1c6a20aa2bf0dec156e1902849535876bffd9cd2701225b7f774c6ec883b2d1"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.534181 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g" Oct 01 11:43:24 crc 
kubenswrapper[4669]: I1001 11:43:24.546637 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57" event={"ID":"4fa32a0a-904e-4b37-8ffb-a8c1d89df689","Type":"ContainerStarted","Data":"51915e82bafe921cd8e223c5ff88a1a91c1d705251591fa5ce0e4a1b9989fee0"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.547034 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.554432 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq" event={"ID":"1265856e-7658-44ca-b0a9-a0a5a42b8f5d","Type":"ContainerStarted","Data":"2a21288d9d0aeb73292fee0707461c1fe52960a76f9c30fdf4a4450bcc7dfc94"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.554884 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.560367 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66" podStartSLOduration=5.256066792 podStartE2EDuration="18.560341976s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.475025782 +0000 UTC m=+879.574590759" lastFinishedPulling="2025-10-01 11:43:21.779300956 +0000 UTC m=+892.878865943" observedRunningTime="2025-10-01 11:43:24.557282971 +0000 UTC m=+895.656847958" watchObservedRunningTime="2025-10-01 11:43:24.560341976 +0000 UTC m=+895.659906963" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.560451 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6" 
event={"ID":"e8163ded-d297-43ea-bde7-b5b90bdf1d17","Type":"ContainerStarted","Data":"92b539a87f138e51265957b1f780f566b6e15e8d7f696051c01c4e780165968b"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.560872 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.565004 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b" event={"ID":"863b3375-804f-4c8b-ba14-01230d822604","Type":"ContainerStarted","Data":"08d566b482796dc35c456b4894fed013316735b6cc1c15b1b19825e2cf9c2b5b"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.565489 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.566724 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp" event={"ID":"7f0d56cb-1002-4345-903e-7e5979f47978","Type":"ContainerStarted","Data":"b8619cafdf084f2c37b015a06ba3aa1bd2ba43442b7beb021f84ac159f6c73d1"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.567156 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.568668 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f" event={"ID":"8783a088-91c6-4f3c-bc34-b3d5a805ea07","Type":"ContainerStarted","Data":"45ee35c1846931adb45a308ae4c445dce5251bda5643202025208bc492f27185"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.569049 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.576726 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57" podStartSLOduration=4.710700057 podStartE2EDuration="18.576714331s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:07.874962885 +0000 UTC m=+878.974527862" lastFinishedPulling="2025-10-01 11:43:21.740977159 +0000 UTC m=+892.840542136" observedRunningTime="2025-10-01 11:43:24.574874006 +0000 UTC m=+895.674438993" watchObservedRunningTime="2025-10-01 11:43:24.576714331 +0000 UTC m=+895.676279308" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.585280 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258" event={"ID":"707270cf-007e-4572-bae9-dd6b4c6e50d3","Type":"ContainerStarted","Data":"21384ceed94eb8a09badac14285091ab0bffc9fdfa819180f33ca635663a6583"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.585896 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.590352 4669 generic.go:334] "Generic (PLEG): container finished" podID="214eba80-e62b-41f4-8100-e63e9d8be3dd" containerID="ce066b20a99ca6cdafbab7225bfe0b89e6d9f635fdeece5adcd766c98e59a30b" exitCode=0 Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.590417 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdh8p" event={"ID":"214eba80-e62b-41f4-8100-e63e9d8be3dd","Type":"ContainerDied","Data":"ce066b20a99ca6cdafbab7225bfe0b89e6d9f635fdeece5adcd766c98e59a30b"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.594537 4669 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g" podStartSLOduration=5.652777631 podStartE2EDuration="18.594528102s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.810788194 +0000 UTC m=+879.910353171" lastFinishedPulling="2025-10-01 11:43:21.752538665 +0000 UTC m=+892.852103642" observedRunningTime="2025-10-01 11:43:24.592313577 +0000 UTC m=+895.691878554" watchObservedRunningTime="2025-10-01 11:43:24.594528102 +0000 UTC m=+895.694093079" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.596923 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx" event={"ID":"4f573e37-cb0a-4eba-9477-7c3d71276c86","Type":"ContainerStarted","Data":"7efbb398ec5627bca183f8c98083bf922f2faf9465b267d3ba59349960fd5a42"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.597544 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.606593 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk" event={"ID":"99c2ea9b-bcc7-4933-9614-94c32861e93c","Type":"ContainerStarted","Data":"13540d18d8fefa8576605af2bbc0e3787e5f24a2e2ce0beea5e97c39a9f77275"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.606659 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.614102 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw" event={"ID":"e24ede8f-da24-4161-8621-d8b5abd08c1f","Type":"ContainerStarted","Data":"535dc1e1294bd89b17b2f92a8a15fdcf0169a6e33f952e3654ce252a475e3ed0"} 
Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.614331 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.619905 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-fssdp" event={"ID":"08897606-8ccd-4508-bf20-501855920e9e","Type":"ContainerStarted","Data":"b326399da48ce5d46a67bc838440961b902028a3cb18cd9f6dde415bccd7fd0e"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.620025 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-fssdp" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.619894 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq" podStartSLOduration=5.322077406 podStartE2EDuration="18.619882859s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.476296234 +0000 UTC m=+879.575861211" lastFinishedPulling="2025-10-01 11:43:21.774101667 +0000 UTC m=+892.873666664" observedRunningTime="2025-10-01 11:43:24.618949035 +0000 UTC m=+895.718514012" watchObservedRunningTime="2025-10-01 11:43:24.619882859 +0000 UTC m=+895.719447836" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.627934 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q" event={"ID":"ebc9c519-e267-43d1-93b7-4cf38c84cc66","Type":"ContainerStarted","Data":"d4fb365645340c62a8632ea90d075d09c10fa873d8c0696078f4cb7ccb71df74"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.630310 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q" Oct 01 11:43:24 crc 
kubenswrapper[4669]: I1001 11:43:24.636586 4669 generic.go:334] "Generic (PLEG): container finished" podID="fa2b0226-7dae-4379-b54f-3650d7208784" containerID="13d5f8920c9e231207cd43b86f963c417c2c836fbc99bebf9e2dee24ba209dde" exitCode=0 Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.636646 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfs5s" event={"ID":"fa2b0226-7dae-4379-b54f-3650d7208784","Type":"ContainerDied","Data":"13d5f8920c9e231207cd43b86f963c417c2c836fbc99bebf9e2dee24ba209dde"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.651322 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq" podStartSLOduration=4.169918118 podStartE2EDuration="18.651298805s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:07.288827884 +0000 UTC m=+878.388392861" lastFinishedPulling="2025-10-01 11:43:21.770208551 +0000 UTC m=+892.869773548" observedRunningTime="2025-10-01 11:43:24.649494831 +0000 UTC m=+895.749059808" watchObservedRunningTime="2025-10-01 11:43:24.651298805 +0000 UTC m=+895.750863782" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.658773 4669 generic.go:334] "Generic (PLEG): container finished" podID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerID="84e3242eeaed856f0043104008ff759dbb37349d6365b00aadfa9ce43b31015b" exitCode=0 Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.660047 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vc2c" event={"ID":"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969","Type":"ContainerDied","Data":"84e3242eeaed856f0043104008ff759dbb37349d6365b00aadfa9ce43b31015b"} Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.688173 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp" podStartSLOduration=4.317738031 podStartE2EDuration="18.688150226s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:07.280166299 +0000 UTC m=+878.379731276" lastFinishedPulling="2025-10-01 11:43:21.650578484 +0000 UTC m=+892.750143471" observedRunningTime="2025-10-01 11:43:24.672065619 +0000 UTC m=+895.771630596" watchObservedRunningTime="2025-10-01 11:43:24.688150226 +0000 UTC m=+895.787715193" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.705342 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-fssdp" podStartSLOduration=5.76802392 podStartE2EDuration="18.705324931s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.803723259 +0000 UTC m=+879.903288236" lastFinishedPulling="2025-10-01 11:43:21.74102427 +0000 UTC m=+892.840589247" observedRunningTime="2025-10-01 11:43:24.70166523 +0000 UTC m=+895.801230207" watchObservedRunningTime="2025-10-01 11:43:24.705324931 +0000 UTC m=+895.804889898" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.727240 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw" podStartSLOduration=5.570950407 podStartE2EDuration="18.727197261s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.564633668 +0000 UTC m=+879.664198645" lastFinishedPulling="2025-10-01 11:43:21.720880522 +0000 UTC m=+892.820445499" observedRunningTime="2025-10-01 11:43:24.726515145 +0000 UTC m=+895.826080122" watchObservedRunningTime="2025-10-01 11:43:24.727197261 +0000 UTC m=+895.826762238" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.787522 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f" podStartSLOduration=5.56543375 podStartE2EDuration="18.787492352s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.546903589 +0000 UTC m=+879.646468566" lastFinishedPulling="2025-10-01 11:43:21.768962201 +0000 UTC m=+892.868527168" observedRunningTime="2025-10-01 11:43:24.776293485 +0000 UTC m=+895.875858462" watchObservedRunningTime="2025-10-01 11:43:24.787492352 +0000 UTC m=+895.887057329" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.807187 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258" podStartSLOduration=5.87358336 podStartE2EDuration="18.807156098s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.787342284 +0000 UTC m=+879.886907261" lastFinishedPulling="2025-10-01 11:43:21.720915012 +0000 UTC m=+892.820479999" observedRunningTime="2025-10-01 11:43:24.798945466 +0000 UTC m=+895.898510443" watchObservedRunningTime="2025-10-01 11:43:24.807156098 +0000 UTC m=+895.906721075" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.819861 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx" podStartSLOduration=5.595027633 podStartE2EDuration="18.819838773s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.527688814 +0000 UTC m=+879.627253791" lastFinishedPulling="2025-10-01 11:43:21.752499954 +0000 UTC m=+892.852064931" observedRunningTime="2025-10-01 11:43:24.815173877 +0000 UTC m=+895.914738854" watchObservedRunningTime="2025-10-01 11:43:24.819838773 +0000 UTC m=+895.919403750" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.844435 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q" podStartSLOduration=5.65036604 podStartE2EDuration="18.84440325s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.529051997 +0000 UTC m=+879.628616974" lastFinishedPulling="2025-10-01 11:43:21.723089207 +0000 UTC m=+892.822654184" observedRunningTime="2025-10-01 11:43:24.835064958 +0000 UTC m=+895.934629945" watchObservedRunningTime="2025-10-01 11:43:24.84440325 +0000 UTC m=+895.943968237" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.860566 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6" podStartSLOduration=5.009363923 podStartE2EDuration="18.860543829s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:07.91113261 +0000 UTC m=+879.010697587" lastFinishedPulling="2025-10-01 11:43:21.762312496 +0000 UTC m=+892.861877493" observedRunningTime="2025-10-01 11:43:24.857463382 +0000 UTC m=+895.957028359" watchObservedRunningTime="2025-10-01 11:43:24.860543829 +0000 UTC m=+895.960108806" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.882857 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk" podStartSLOduration=5.642579859 podStartE2EDuration="18.88282944s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.537776524 +0000 UTC m=+879.637341501" lastFinishedPulling="2025-10-01 11:43:21.778026105 +0000 UTC m=+892.877591082" observedRunningTime="2025-10-01 11:43:24.879737253 +0000 UTC m=+895.979302220" watchObservedRunningTime="2025-10-01 11:43:24.88282944 +0000 UTC m=+895.982394427" Oct 01 11:43:24 crc kubenswrapper[4669]: I1001 11:43:24.917362 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b" podStartSLOduration=5.973405578 podStartE2EDuration="18.917339133s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.797096095 +0000 UTC m=+879.896661072" lastFinishedPulling="2025-10-01 11:43:21.74102963 +0000 UTC m=+892.840594627" observedRunningTime="2025-10-01 11:43:24.915859336 +0000 UTC m=+896.015424333" watchObservedRunningTime="2025-10-01 11:43:24.917339133 +0000 UTC m=+896.016904100" Oct 01 11:43:27 crc kubenswrapper[4669]: I1001 11:43:27.234489 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bc8gx" Oct 01 11:43:27 crc kubenswrapper[4669]: I1001 11:43:27.267616 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-2c2qw" Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.721946 4669 generic.go:334] "Generic (PLEG): container finished" podID="214eba80-e62b-41f4-8100-e63e9d8be3dd" containerID="c19c79ecbff76ffc22480f5b061ad6233243635087bd65bc38e6371bf6ac8b1b" exitCode=0 Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.722099 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdh8p" event={"ID":"214eba80-e62b-41f4-8100-e63e9d8be3dd","Type":"ContainerDied","Data":"c19c79ecbff76ffc22480f5b061ad6233243635087bd65bc38e6371bf6ac8b1b"} Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.725948 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vc2c" event={"ID":"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969","Type":"ContainerStarted","Data":"fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11"} Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.731209 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" event={"ID":"91df1fb9-8c91-4dde-9317-ff09df368c49","Type":"ContainerStarted","Data":"15f4dfd9d859ecdc1d4ba47db45866399b63853eb71157adbfae7a41b18d6042"} Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.731464 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.733958 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" event={"ID":"621748e9-0765-432f-bbc9-9bb62594eff6","Type":"ContainerStarted","Data":"e5eda6560be813b28ef21a9941a04cacfff2c6321469d70b60ab9823e3ec1327"} Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.734262 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.736284 4669 generic.go:334] "Generic (PLEG): container finished" podID="fa2b0226-7dae-4379-b54f-3650d7208784" containerID="6776bcc87e8bbb03de288e0818804a613313a6da625b2e056e871a199a20fba2" exitCode=0 Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.736355 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfs5s" event={"ID":"fa2b0226-7dae-4379-b54f-3650d7208784","Type":"ContainerDied","Data":"6776bcc87e8bbb03de288e0818804a613313a6da625b2e056e871a199a20fba2"} Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.741647 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" event={"ID":"6d2b6087-c54d-4138-b162-e024a7a0e842","Type":"ContainerStarted","Data":"431d42c755fddcfd0764de6ec7177c15b0b752ab240f550bbac77d79d0a78a60"} Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 
11:43:30.745044 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" event={"ID":"fb18dab5-d638-443a-bb62-6508de79bc0f","Type":"ContainerStarted","Data":"b0708bbf893cb6ec5fbc69df2ce8779c49db248f0d12407eb277990914badca6"} Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.745762 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.752320 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" event={"ID":"a887d629-1025-4da7-8c68-4b17c7205479","Type":"ContainerStarted","Data":"d1a75119c13bfd0b453923f3793cf50322966620e411c051030e82127df07a82"} Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.753333 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.764175 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" event={"ID":"a2282a94-4700-4aae-8572-2104962decf8","Type":"ContainerStarted","Data":"fc0b326ca5b3b3408f5a00900970024bbb8a5af4dd6cfb595afb435beef11b1e"} Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.765006 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.783521 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" podStartSLOduration=3.9323614239999998 podStartE2EDuration="24.783501502s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 
11:43:08.817221682 +0000 UTC m=+879.916786659" lastFinishedPulling="2025-10-01 11:43:29.66836172 +0000 UTC m=+900.767926737" observedRunningTime="2025-10-01 11:43:30.783273276 +0000 UTC m=+901.882838263" watchObservedRunningTime="2025-10-01 11:43:30.783501502 +0000 UTC m=+901.883066479" Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.800752 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" podStartSLOduration=5.111027156 podStartE2EDuration="24.800729518s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.956861055 +0000 UTC m=+880.056426032" lastFinishedPulling="2025-10-01 11:43:28.646563407 +0000 UTC m=+899.746128394" observedRunningTime="2025-10-01 11:43:30.797528629 +0000 UTC m=+901.897093606" watchObservedRunningTime="2025-10-01 11:43:30.800729518 +0000 UTC m=+901.900294505" Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.836568 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" podStartSLOduration=5.117670191 podStartE2EDuration="24.836545334s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.927695644 +0000 UTC m=+880.027260621" lastFinishedPulling="2025-10-01 11:43:28.646570787 +0000 UTC m=+899.746135764" observedRunningTime="2025-10-01 11:43:30.83277731 +0000 UTC m=+901.932342287" watchObservedRunningTime="2025-10-01 11:43:30.836545334 +0000 UTC m=+901.936110311" Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.887800 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" podStartSLOduration=4.054231267 podStartE2EDuration="24.88777871s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:09.003231132 
+0000 UTC m=+880.102796109" lastFinishedPulling="2025-10-01 11:43:29.836778575 +0000 UTC m=+900.936343552" observedRunningTime="2025-10-01 11:43:30.86675706 +0000 UTC m=+901.966322047" watchObservedRunningTime="2025-10-01 11:43:30.88777871 +0000 UTC m=+901.987343687" Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.907519 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" podStartSLOduration=5.75834762 podStartE2EDuration="24.907494977s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.972397159 +0000 UTC m=+880.071962136" lastFinishedPulling="2025-10-01 11:43:28.121544506 +0000 UTC m=+899.221109493" observedRunningTime="2025-10-01 11:43:30.90196819 +0000 UTC m=+902.001533187" watchObservedRunningTime="2025-10-01 11:43:30.907494977 +0000 UTC m=+902.007059954" Oct 01 11:43:30 crc kubenswrapper[4669]: I1001 11:43:30.921275 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" podStartSLOduration=5.246308101 podStartE2EDuration="24.921246397s" podCreationTimestamp="2025-10-01 11:43:06 +0000 UTC" firstStartedPulling="2025-10-01 11:43:08.971629971 +0000 UTC m=+880.071194948" lastFinishedPulling="2025-10-01 11:43:28.646568267 +0000 UTC m=+899.746133244" observedRunningTime="2025-10-01 11:43:30.917911595 +0000 UTC m=+902.017476582" watchObservedRunningTime="2025-10-01 11:43:30.921246397 +0000 UTC m=+902.020811374" Oct 01 11:43:31 crc kubenswrapper[4669]: I1001 11:43:31.775021 4669 generic.go:334] "Generic (PLEG): container finished" podID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerID="fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11" exitCode=0 Oct 01 11:43:31 crc kubenswrapper[4669]: I1001 11:43:31.775129 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-8vc2c" event={"ID":"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969","Type":"ContainerDied","Data":"fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11"} Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.391738 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-9m4xq" Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.409044 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-cz2dp" Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.431723 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xwt57" Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.457787 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-dhqk6" Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.601729 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-2z84q" Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.626846 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-5fxfq" Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.666366 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-dv8s2" Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.703704 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-lbq2b" Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.712869 4669 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-j8t6g" Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.890670 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-fssdp" Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.961043 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-nr258" Oct 01 11:43:36 crc kubenswrapper[4669]: I1001 11:43:36.998223 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-7qf7f" Oct 01 11:43:37 crc kubenswrapper[4669]: I1001 11:43:37.083294 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-5mbbk" Oct 01 11:43:37 crc kubenswrapper[4669]: I1001 11:43:37.096783 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-djnfk" Oct 01 11:43:37 crc kubenswrapper[4669]: I1001 11:43:37.215232 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-86p66" Oct 01 11:43:37 crc kubenswrapper[4669]: I1001 11:43:37.338666 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-p5qll" Oct 01 11:43:37 crc kubenswrapper[4669]: I1001 11:43:37.344686 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-85777745bb-vxwz5" Oct 01 11:43:37 crc kubenswrapper[4669]: I1001 11:43:37.397307 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" Oct 01 11:43:37 crc kubenswrapper[4669]: I1001 11:43:37.400700 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-lgckz" Oct 01 11:43:37 crc kubenswrapper[4669]: I1001 11:43:37.637267 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cfvgks" Oct 01 11:43:38 crc kubenswrapper[4669]: I1001 11:43:38.843323 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfs5s" event={"ID":"fa2b0226-7dae-4379-b54f-3650d7208784","Type":"ContainerStarted","Data":"66dd8c3ebd77da2c09352b187f493f6881b173d6e9829cd8ed9b9944fb01d3ea"} Oct 01 11:43:39 crc kubenswrapper[4669]: I1001 11:43:39.881373 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rfs5s" podStartSLOduration=19.832869859 podStartE2EDuration="28.881345849s" podCreationTimestamp="2025-10-01 11:43:11 +0000 UTC" firstStartedPulling="2025-10-01 11:43:24.659120619 +0000 UTC m=+895.758685586" lastFinishedPulling="2025-10-01 11:43:33.707596569 +0000 UTC m=+904.807161576" observedRunningTime="2025-10-01 11:43:39.880369915 +0000 UTC m=+910.979934932" watchObservedRunningTime="2025-10-01 11:43:39.881345849 +0000 UTC m=+910.980910866" Oct 01 11:43:41 crc kubenswrapper[4669]: I1001 11:43:41.900843 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:41 crc kubenswrapper[4669]: I1001 11:43:41.903296 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:41 crc kubenswrapper[4669]: I1001 11:43:41.945899 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:42 crc kubenswrapper[4669]: I1001 11:43:42.897106 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdh8p" event={"ID":"214eba80-e62b-41f4-8100-e63e9d8be3dd","Type":"ContainerStarted","Data":"9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642"} Oct 01 11:43:42 crc kubenswrapper[4669]: I1001 11:43:42.900751 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vc2c" event={"ID":"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969","Type":"ContainerStarted","Data":"d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6"} Oct 01 11:43:42 crc kubenswrapper[4669]: I1001 11:43:42.946991 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rdh8p" podStartSLOduration=15.893822137 podStartE2EDuration="32.946960049s" podCreationTimestamp="2025-10-01 11:43:10 +0000 UTC" firstStartedPulling="2025-10-01 11:43:24.593260691 +0000 UTC m=+895.692825668" lastFinishedPulling="2025-10-01 11:43:41.646398603 +0000 UTC m=+912.745963580" observedRunningTime="2025-10-01 11:43:42.929831568 +0000 UTC m=+914.029396565" watchObservedRunningTime="2025-10-01 11:43:42.946960049 +0000 UTC m=+914.046525036" Oct 01 11:43:42 crc kubenswrapper[4669]: I1001 11:43:42.958257 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8vc2c" podStartSLOduration=11.0211058 podStartE2EDuration="27.958233697s" podCreationTimestamp="2025-10-01 11:43:15 +0000 UTC" firstStartedPulling="2025-10-01 11:43:24.675774631 +0000 UTC m=+895.775339608" lastFinishedPulling="2025-10-01 11:43:41.612902528 +0000 UTC m=+912.712467505" observedRunningTime="2025-10-01 11:43:42.953846949 +0000 UTC m=+914.053411926" watchObservedRunningTime="2025-10-01 11:43:42.958233697 +0000 UTC m=+914.057798674" Oct 01 11:43:43 crc 
kubenswrapper[4669]: I1001 11:43:43.993687 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:45 crc kubenswrapper[4669]: I1001 11:43:45.485343 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:45 crc kubenswrapper[4669]: I1001 11:43:45.485416 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:46 crc kubenswrapper[4669]: I1001 11:43:46.560845 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8vc2c" podUID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerName="registry-server" probeResult="failure" output=< Oct 01 11:43:46 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 01 11:43:46 crc kubenswrapper[4669]: > Oct 01 11:43:46 crc kubenswrapper[4669]: I1001 11:43:46.708860 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rfs5s"] Oct 01 11:43:46 crc kubenswrapper[4669]: I1001 11:43:46.709333 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rfs5s" podUID="fa2b0226-7dae-4379-b54f-3650d7208784" containerName="registry-server" containerID="cri-o://66dd8c3ebd77da2c09352b187f493f6881b173d6e9829cd8ed9b9944fb01d3ea" gracePeriod=2 Oct 01 11:43:46 crc kubenswrapper[4669]: I1001 11:43:46.948618 4669 generic.go:334] "Generic (PLEG): container finished" podID="fa2b0226-7dae-4379-b54f-3650d7208784" containerID="66dd8c3ebd77da2c09352b187f493f6881b173d6e9829cd8ed9b9944fb01d3ea" exitCode=0 Oct 01 11:43:46 crc kubenswrapper[4669]: I1001 11:43:46.948686 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfs5s" 
event={"ID":"fa2b0226-7dae-4379-b54f-3650d7208784","Type":"ContainerDied","Data":"66dd8c3ebd77da2c09352b187f493f6881b173d6e9829cd8ed9b9944fb01d3ea"} Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.223221 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.325033 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-catalog-content\") pod \"fa2b0226-7dae-4379-b54f-3650d7208784\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.325132 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-utilities\") pod \"fa2b0226-7dae-4379-b54f-3650d7208784\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.325214 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27bww\" (UniqueName: \"kubernetes.io/projected/fa2b0226-7dae-4379-b54f-3650d7208784-kube-api-access-27bww\") pod \"fa2b0226-7dae-4379-b54f-3650d7208784\" (UID: \"fa2b0226-7dae-4379-b54f-3650d7208784\") " Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.327006 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-utilities" (OuterVolumeSpecName: "utilities") pod "fa2b0226-7dae-4379-b54f-3650d7208784" (UID: "fa2b0226-7dae-4379-b54f-3650d7208784"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.332879 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2b0226-7dae-4379-b54f-3650d7208784-kube-api-access-27bww" (OuterVolumeSpecName: "kube-api-access-27bww") pod "fa2b0226-7dae-4379-b54f-3650d7208784" (UID: "fa2b0226-7dae-4379-b54f-3650d7208784"). InnerVolumeSpecName "kube-api-access-27bww". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.375628 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa2b0226-7dae-4379-b54f-3650d7208784" (UID: "fa2b0226-7dae-4379-b54f-3650d7208784"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.429475 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.429521 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2b0226-7dae-4379-b54f-3650d7208784-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.429551 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27bww\" (UniqueName: \"kubernetes.io/projected/fa2b0226-7dae-4379-b54f-3650d7208784-kube-api-access-27bww\") on node \"crc\" DevicePath \"\"" Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.967046 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfs5s" 
event={"ID":"fa2b0226-7dae-4379-b54f-3650d7208784","Type":"ContainerDied","Data":"060a993444e4dc0ae7965b021374a7c6ab69d9205dcf27ffb316db3db8d3577c"} Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.967162 4669 scope.go:117] "RemoveContainer" containerID="66dd8c3ebd77da2c09352b187f493f6881b173d6e9829cd8ed9b9944fb01d3ea" Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.967225 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rfs5s" Oct 01 11:43:47 crc kubenswrapper[4669]: I1001 11:43:47.993806 4669 scope.go:117] "RemoveContainer" containerID="6776bcc87e8bbb03de288e0818804a613313a6da625b2e056e871a199a20fba2" Oct 01 11:43:48 crc kubenswrapper[4669]: I1001 11:43:48.005307 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rfs5s"] Oct 01 11:43:48 crc kubenswrapper[4669]: I1001 11:43:48.012764 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rfs5s"] Oct 01 11:43:48 crc kubenswrapper[4669]: I1001 11:43:48.030646 4669 scope.go:117] "RemoveContainer" containerID="13d5f8920c9e231207cd43b86f963c417c2c836fbc99bebf9e2dee24ba209dde" Oct 01 11:43:49 crc kubenswrapper[4669]: I1001 11:43:49.661940 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa2b0226-7dae-4379-b54f-3650d7208784" path="/var/lib/kubelet/pods/fa2b0226-7dae-4379-b54f-3650d7208784/volumes" Oct 01 11:43:50 crc kubenswrapper[4669]: I1001 11:43:50.702429 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:50 crc kubenswrapper[4669]: I1001 11:43:50.702521 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:50 crc kubenswrapper[4669]: I1001 11:43:50.766841 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:51 crc kubenswrapper[4669]: I1001 11:43:51.061832 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:51 crc kubenswrapper[4669]: I1001 11:43:51.907882 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rdh8p"] Oct 01 11:43:53 crc kubenswrapper[4669]: I1001 11:43:53.017309 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rdh8p" podUID="214eba80-e62b-41f4-8100-e63e9d8be3dd" containerName="registry-server" containerID="cri-o://9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642" gracePeriod=2 Oct 01 11:43:53 crc kubenswrapper[4669]: I1001 11:43:53.507991 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:53 crc kubenswrapper[4669]: I1001 11:43:53.528654 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-utilities\") pod \"214eba80-e62b-41f4-8100-e63e9d8be3dd\" (UID: \"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " Oct 01 11:43:53 crc kubenswrapper[4669]: I1001 11:43:53.528715 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-catalog-content\") pod \"214eba80-e62b-41f4-8100-e63e9d8be3dd\" (UID: \"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " Oct 01 11:43:53 crc kubenswrapper[4669]: I1001 11:43:53.528739 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7h8j\" (UniqueName: \"kubernetes.io/projected/214eba80-e62b-41f4-8100-e63e9d8be3dd-kube-api-access-p7h8j\") pod 
\"214eba80-e62b-41f4-8100-e63e9d8be3dd\" (UID: \"214eba80-e62b-41f4-8100-e63e9d8be3dd\") " Oct 01 11:43:53 crc kubenswrapper[4669]: I1001 11:43:53.537640 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214eba80-e62b-41f4-8100-e63e9d8be3dd-kube-api-access-p7h8j" (OuterVolumeSpecName: "kube-api-access-p7h8j") pod "214eba80-e62b-41f4-8100-e63e9d8be3dd" (UID: "214eba80-e62b-41f4-8100-e63e9d8be3dd"). InnerVolumeSpecName "kube-api-access-p7h8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:43:53 crc kubenswrapper[4669]: I1001 11:43:53.543952 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-utilities" (OuterVolumeSpecName: "utilities") pod "214eba80-e62b-41f4-8100-e63e9d8be3dd" (UID: "214eba80-e62b-41f4-8100-e63e9d8be3dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:43:53 crc kubenswrapper[4669]: I1001 11:43:53.584753 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "214eba80-e62b-41f4-8100-e63e9d8be3dd" (UID: "214eba80-e62b-41f4-8100-e63e9d8be3dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:43:53 crc kubenswrapper[4669]: I1001 11:43:53.630317 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:43:53 crc kubenswrapper[4669]: I1001 11:43:53.630352 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7h8j\" (UniqueName: \"kubernetes.io/projected/214eba80-e62b-41f4-8100-e63e9d8be3dd-kube-api-access-p7h8j\") on node \"crc\" DevicePath \"\"" Oct 01 11:43:53 crc kubenswrapper[4669]: I1001 11:43:53.630366 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/214eba80-e62b-41f4-8100-e63e9d8be3dd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.035400 4669 generic.go:334] "Generic (PLEG): container finished" podID="214eba80-e62b-41f4-8100-e63e9d8be3dd" containerID="9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642" exitCode=0 Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.035495 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdh8p" event={"ID":"214eba80-e62b-41f4-8100-e63e9d8be3dd","Type":"ContainerDied","Data":"9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642"} Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.035504 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rdh8p" Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.035569 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdh8p" event={"ID":"214eba80-e62b-41f4-8100-e63e9d8be3dd","Type":"ContainerDied","Data":"ed751432bd287b204dbfc7ec68e78fe679db13b662d26db40d79061dfc40da37"} Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.035618 4669 scope.go:117] "RemoveContainer" containerID="9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642" Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.067865 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rdh8p"] Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.074041 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rdh8p"] Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.076171 4669 scope.go:117] "RemoveContainer" containerID="c19c79ecbff76ffc22480f5b061ad6233243635087bd65bc38e6371bf6ac8b1b" Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.100192 4669 scope.go:117] "RemoveContainer" containerID="ce066b20a99ca6cdafbab7225bfe0b89e6d9f635fdeece5adcd766c98e59a30b" Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.132126 4669 scope.go:117] "RemoveContainer" containerID="9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642" Oct 01 11:43:54 crc kubenswrapper[4669]: E1001 11:43:54.132793 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642\": container with ID starting with 9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642 not found: ID does not exist" containerID="9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642" Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.132875 4669 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642"} err="failed to get container status \"9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642\": rpc error: code = NotFound desc = could not find container \"9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642\": container with ID starting with 9788288159bd1462734f23d81f8361d0cfbab3f4973f48f77bf19fe757db1642 not found: ID does not exist" Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.132916 4669 scope.go:117] "RemoveContainer" containerID="c19c79ecbff76ffc22480f5b061ad6233243635087bd65bc38e6371bf6ac8b1b" Oct 01 11:43:54 crc kubenswrapper[4669]: E1001 11:43:54.133281 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c19c79ecbff76ffc22480f5b061ad6233243635087bd65bc38e6371bf6ac8b1b\": container with ID starting with c19c79ecbff76ffc22480f5b061ad6233243635087bd65bc38e6371bf6ac8b1b not found: ID does not exist" containerID="c19c79ecbff76ffc22480f5b061ad6233243635087bd65bc38e6371bf6ac8b1b" Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.133314 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c19c79ecbff76ffc22480f5b061ad6233243635087bd65bc38e6371bf6ac8b1b"} err="failed to get container status \"c19c79ecbff76ffc22480f5b061ad6233243635087bd65bc38e6371bf6ac8b1b\": rpc error: code = NotFound desc = could not find container \"c19c79ecbff76ffc22480f5b061ad6233243635087bd65bc38e6371bf6ac8b1b\": container with ID starting with c19c79ecbff76ffc22480f5b061ad6233243635087bd65bc38e6371bf6ac8b1b not found: ID does not exist" Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.133331 4669 scope.go:117] "RemoveContainer" containerID="ce066b20a99ca6cdafbab7225bfe0b89e6d9f635fdeece5adcd766c98e59a30b" Oct 01 11:43:54 crc kubenswrapper[4669]: E1001 
11:43:54.133916 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce066b20a99ca6cdafbab7225bfe0b89e6d9f635fdeece5adcd766c98e59a30b\": container with ID starting with ce066b20a99ca6cdafbab7225bfe0b89e6d9f635fdeece5adcd766c98e59a30b not found: ID does not exist" containerID="ce066b20a99ca6cdafbab7225bfe0b89e6d9f635fdeece5adcd766c98e59a30b" Oct 01 11:43:54 crc kubenswrapper[4669]: I1001 11:43:54.133967 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce066b20a99ca6cdafbab7225bfe0b89e6d9f635fdeece5adcd766c98e59a30b"} err="failed to get container status \"ce066b20a99ca6cdafbab7225bfe0b89e6d9f635fdeece5adcd766c98e59a30b\": rpc error: code = NotFound desc = could not find container \"ce066b20a99ca6cdafbab7225bfe0b89e6d9f635fdeece5adcd766c98e59a30b\": container with ID starting with ce066b20a99ca6cdafbab7225bfe0b89e6d9f635fdeece5adcd766c98e59a30b not found: ID does not exist" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.398709 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rghs8"] Oct 01 11:43:55 crc kubenswrapper[4669]: E1001 11:43:55.399357 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2b0226-7dae-4379-b54f-3650d7208784" containerName="registry-server" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.399371 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2b0226-7dae-4379-b54f-3650d7208784" containerName="registry-server" Oct 01 11:43:55 crc kubenswrapper[4669]: E1001 11:43:55.399404 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2b0226-7dae-4379-b54f-3650d7208784" containerName="extract-content" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.399411 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2b0226-7dae-4379-b54f-3650d7208784" containerName="extract-content" Oct 01 11:43:55 crc 
kubenswrapper[4669]: E1001 11:43:55.399432 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214eba80-e62b-41f4-8100-e63e9d8be3dd" containerName="extract-content" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.399439 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="214eba80-e62b-41f4-8100-e63e9d8be3dd" containerName="extract-content" Oct 01 11:43:55 crc kubenswrapper[4669]: E1001 11:43:55.399456 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214eba80-e62b-41f4-8100-e63e9d8be3dd" containerName="extract-utilities" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.399461 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="214eba80-e62b-41f4-8100-e63e9d8be3dd" containerName="extract-utilities" Oct 01 11:43:55 crc kubenswrapper[4669]: E1001 11:43:55.399472 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214eba80-e62b-41f4-8100-e63e9d8be3dd" containerName="registry-server" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.399478 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="214eba80-e62b-41f4-8100-e63e9d8be3dd" containerName="registry-server" Oct 01 11:43:55 crc kubenswrapper[4669]: E1001 11:43:55.399493 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2b0226-7dae-4379-b54f-3650d7208784" containerName="extract-utilities" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.399499 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2b0226-7dae-4379-b54f-3650d7208784" containerName="extract-utilities" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.399667 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="214eba80-e62b-41f4-8100-e63e9d8be3dd" containerName="registry-server" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.399683 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa2b0226-7dae-4379-b54f-3650d7208784" containerName="registry-server" Oct 01 11:43:55 
crc kubenswrapper[4669]: I1001 11:43:55.400530 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.405484 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.405655 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.405793 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.405947 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-vjs2g" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.411968 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rghs8"] Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.462512 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bb4qz"] Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.464894 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.465981 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adaa222-a936-47ee-bec6-facc69d10d36-config\") pod \"dnsmasq-dns-675f4bcbfc-rghs8\" (UID: \"6adaa222-a936-47ee-bec6-facc69d10d36\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.466047 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rbf5\" (UniqueName: \"kubernetes.io/projected/6adaa222-a936-47ee-bec6-facc69d10d36-kube-api-access-4rbf5\") pod \"dnsmasq-dns-675f4bcbfc-rghs8\" (UID: \"6adaa222-a936-47ee-bec6-facc69d10d36\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.467941 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.479548 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bb4qz"] Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.551942 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.567510 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bb4qz\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.567593 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6adaa222-a936-47ee-bec6-facc69d10d36-config\") pod \"dnsmasq-dns-675f4bcbfc-rghs8\" (UID: \"6adaa222-a936-47ee-bec6-facc69d10d36\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.567650 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htbt2\" (UniqueName: \"kubernetes.io/projected/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-kube-api-access-htbt2\") pod \"dnsmasq-dns-78dd6ddcc-bb4qz\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.567692 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rbf5\" (UniqueName: \"kubernetes.io/projected/6adaa222-a936-47ee-bec6-facc69d10d36-kube-api-access-4rbf5\") pod \"dnsmasq-dns-675f4bcbfc-rghs8\" (UID: \"6adaa222-a936-47ee-bec6-facc69d10d36\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.567746 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-config\") pod \"dnsmasq-dns-78dd6ddcc-bb4qz\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.569205 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adaa222-a936-47ee-bec6-facc69d10d36-config\") pod \"dnsmasq-dns-675f4bcbfc-rghs8\" (UID: \"6adaa222-a936-47ee-bec6-facc69d10d36\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.591886 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rbf5\" (UniqueName: 
\"kubernetes.io/projected/6adaa222-a936-47ee-bec6-facc69d10d36-kube-api-access-4rbf5\") pod \"dnsmasq-dns-675f4bcbfc-rghs8\" (UID: \"6adaa222-a936-47ee-bec6-facc69d10d36\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.594355 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.654990 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="214eba80-e62b-41f4-8100-e63e9d8be3dd" path="/var/lib/kubelet/pods/214eba80-e62b-41f4-8100-e63e9d8be3dd/volumes" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.669066 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-config\") pod \"dnsmasq-dns-78dd6ddcc-bb4qz\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.669190 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bb4qz\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.669238 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htbt2\" (UniqueName: \"kubernetes.io/projected/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-kube-api-access-htbt2\") pod \"dnsmasq-dns-78dd6ddcc-bb4qz\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.670196 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bb4qz\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.670210 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-config\") pod \"dnsmasq-dns-78dd6ddcc-bb4qz\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.688412 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htbt2\" (UniqueName: \"kubernetes.io/projected/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-kube-api-access-htbt2\") pod \"dnsmasq-dns-78dd6ddcc-bb4qz\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.719780 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" Oct 01 11:43:55 crc kubenswrapper[4669]: I1001 11:43:55.782409 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:43:56 crc kubenswrapper[4669]: I1001 11:43:56.237921 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rghs8"] Oct 01 11:43:56 crc kubenswrapper[4669]: W1001 11:43:56.248001 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6adaa222_a936_47ee_bec6_facc69d10d36.slice/crio-0be579c7ba588400caee7656af56aa7893a4b52d2dab4dbf4104f9deac4809a4 WatchSource:0}: Error finding container 0be579c7ba588400caee7656af56aa7893a4b52d2dab4dbf4104f9deac4809a4: Status 404 returned error can't find the container with id 0be579c7ba588400caee7656af56aa7893a4b52d2dab4dbf4104f9deac4809a4 Oct 01 11:43:56 crc kubenswrapper[4669]: I1001 11:43:56.305735 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bb4qz"] Oct 01 11:43:56 crc kubenswrapper[4669]: W1001 11:43:56.308486 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca892c75_1ca1_4192_b6da_7ac18e3eba1a.slice/crio-8f0b3ad6863dd027e2cdceacf295da2e6e56f48dd4609ea64d5e18802c41ef55 WatchSource:0}: Error finding container 8f0b3ad6863dd027e2cdceacf295da2e6e56f48dd4609ea64d5e18802c41ef55: Status 404 returned error can't find the container with id 8f0b3ad6863dd027e2cdceacf295da2e6e56f48dd4609ea64d5e18802c41ef55 Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.082802 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" event={"ID":"ca892c75-1ca1-4192-b6da-7ac18e3eba1a","Type":"ContainerStarted","Data":"8f0b3ad6863dd027e2cdceacf295da2e6e56f48dd4609ea64d5e18802c41ef55"} Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.086620 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" 
event={"ID":"6adaa222-a936-47ee-bec6-facc69d10d36","Type":"ContainerStarted","Data":"0be579c7ba588400caee7656af56aa7893a4b52d2dab4dbf4104f9deac4809a4"} Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.106520 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8vc2c"] Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.107135 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8vc2c" podUID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerName="registry-server" containerID="cri-o://d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6" gracePeriod=2 Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.513671 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.719706 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-catalog-content\") pod \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.720230 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-utilities\") pod \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.720378 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5vjw\" (UniqueName: \"kubernetes.io/projected/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-kube-api-access-z5vjw\") pod \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\" (UID: \"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969\") " Oct 01 11:43:57 crc kubenswrapper[4669]: 
I1001 11:43:57.722140 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-utilities" (OuterVolumeSpecName: "utilities") pod "049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" (UID: "049fd2a0-3cfd-4c63-a2e3-5dde72ebd969"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.725761 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.730293 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-kube-api-access-z5vjw" (OuterVolumeSpecName: "kube-api-access-z5vjw") pod "049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" (UID: "049fd2a0-3cfd-4c63-a2e3-5dde72ebd969"). InnerVolumeSpecName "kube-api-access-z5vjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.812007 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" (UID: "049fd2a0-3cfd-4c63-a2e3-5dde72ebd969"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.828541 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5vjw\" (UniqueName: \"kubernetes.io/projected/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-kube-api-access-z5vjw\") on node \"crc\" DevicePath \"\"" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.828590 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.954805 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rghs8"] Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.992571 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l9zjr"] Oct 01 11:43:57 crc kubenswrapper[4669]: E1001 11:43:57.992999 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerName="extract-content" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.993027 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerName="extract-content" Oct 01 11:43:57 crc kubenswrapper[4669]: E1001 11:43:57.993053 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerName="extract-utilities" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.993063 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerName="extract-utilities" Oct 01 11:43:57 crc kubenswrapper[4669]: E1001 11:43:57.993125 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerName="registry-server" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.993135 
4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerName="registry-server" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.993347 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerName="registry-server" Oct 01 11:43:57 crc kubenswrapper[4669]: I1001 11:43:57.994306 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.018726 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l9zjr"] Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.049217 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8x72\" (UniqueName: \"kubernetes.io/projected/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-kube-api-access-s8x72\") pod \"dnsmasq-dns-666b6646f7-l9zjr\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.049616 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-dns-svc\") pod \"dnsmasq-dns-666b6646f7-l9zjr\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.049724 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-config\") pod \"dnsmasq-dns-666b6646f7-l9zjr\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.102879 4669 generic.go:334] "Generic (PLEG): container 
finished" podID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" containerID="d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6" exitCode=0 Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.102946 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vc2c" event={"ID":"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969","Type":"ContainerDied","Data":"d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6"} Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.102985 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vc2c" event={"ID":"049fd2a0-3cfd-4c63-a2e3-5dde72ebd969","Type":"ContainerDied","Data":"de99bf54e60feb896dfe7fa14a2c5414b16db4f02ad30e51597623ce81f68217"} Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.103009 4669 scope.go:117] "RemoveContainer" containerID="d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.103254 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8vc2c" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.144648 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8vc2c"] Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.153325 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8x72\" (UniqueName: \"kubernetes.io/projected/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-kube-api-access-s8x72\") pod \"dnsmasq-dns-666b6646f7-l9zjr\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.153424 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-dns-svc\") pod \"dnsmasq-dns-666b6646f7-l9zjr\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.153462 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-config\") pod \"dnsmasq-dns-666b6646f7-l9zjr\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.154786 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-config\") pod \"dnsmasq-dns-666b6646f7-l9zjr\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.155196 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-dns-svc\") 
pod \"dnsmasq-dns-666b6646f7-l9zjr\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.157589 4669 scope.go:117] "RemoveContainer" containerID="fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.164000 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8vc2c"] Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.178357 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8x72\" (UniqueName: \"kubernetes.io/projected/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-kube-api-access-s8x72\") pod \"dnsmasq-dns-666b6646f7-l9zjr\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.254182 4669 scope.go:117] "RemoveContainer" containerID="84e3242eeaed856f0043104008ff759dbb37349d6365b00aadfa9ce43b31015b" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.355484 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.362295 4669 scope.go:117] "RemoveContainer" containerID="d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6" Oct 01 11:43:58 crc kubenswrapper[4669]: E1001 11:43:58.363152 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6\": container with ID starting with d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6 not found: ID does not exist" containerID="d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.363202 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6"} err="failed to get container status \"d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6\": rpc error: code = NotFound desc = could not find container \"d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6\": container with ID starting with d62888f019ffb5b3ca2de52b2d02025930d2b4a6042fb68ba9ad9ef75209f4f6 not found: ID does not exist" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.363236 4669 scope.go:117] "RemoveContainer" containerID="fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11" Oct 01 11:43:58 crc kubenswrapper[4669]: E1001 11:43:58.371287 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11\": container with ID starting with fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11 not found: ID does not exist" containerID="fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 
11:43:58.371343 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11"} err="failed to get container status \"fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11\": rpc error: code = NotFound desc = could not find container \"fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11\": container with ID starting with fa637f94ded971df16503ef2b6528429395dece6e915a1f628ebbc6b4986dc11 not found: ID does not exist" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.371400 4669 scope.go:117] "RemoveContainer" containerID="84e3242eeaed856f0043104008ff759dbb37349d6365b00aadfa9ce43b31015b" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.363015 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bb4qz"] Oct 01 11:43:58 crc kubenswrapper[4669]: E1001 11:43:58.376240 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84e3242eeaed856f0043104008ff759dbb37349d6365b00aadfa9ce43b31015b\": container with ID starting with 84e3242eeaed856f0043104008ff759dbb37349d6365b00aadfa9ce43b31015b not found: ID does not exist" containerID="84e3242eeaed856f0043104008ff759dbb37349d6365b00aadfa9ce43b31015b" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.376279 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e3242eeaed856f0043104008ff759dbb37349d6365b00aadfa9ce43b31015b"} err="failed to get container status \"84e3242eeaed856f0043104008ff759dbb37349d6365b00aadfa9ce43b31015b\": rpc error: code = NotFound desc = could not find container \"84e3242eeaed856f0043104008ff759dbb37349d6365b00aadfa9ce43b31015b\": container with ID starting with 84e3242eeaed856f0043104008ff759dbb37349d6365b00aadfa9ce43b31015b not found: ID does not exist" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 
11:43:58.421723 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mlrtj"] Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.423060 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.452489 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mlrtj"] Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.463938 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-config\") pod \"dnsmasq-dns-57d769cc4f-mlrtj\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.464631 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv5sn\" (UniqueName: \"kubernetes.io/projected/e20229bd-54f8-4a0e-a2d4-42102e32950e-kube-api-access-fv5sn\") pod \"dnsmasq-dns-57d769cc4f-mlrtj\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.464854 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mlrtj\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.567338 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mlrtj\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.567429 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-config\") pod \"dnsmasq-dns-57d769cc4f-mlrtj\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.567473 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv5sn\" (UniqueName: \"kubernetes.io/projected/e20229bd-54f8-4a0e-a2d4-42102e32950e-kube-api-access-fv5sn\") pod \"dnsmasq-dns-57d769cc4f-mlrtj\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.568309 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mlrtj\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.568359 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-config\") pod \"dnsmasq-dns-57d769cc4f-mlrtj\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.607748 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv5sn\" (UniqueName: \"kubernetes.io/projected/e20229bd-54f8-4a0e-a2d4-42102e32950e-kube-api-access-fv5sn\") pod \"dnsmasq-dns-57d769cc4f-mlrtj\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 
11:43:58.760618 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:43:58 crc kubenswrapper[4669]: I1001 11:43:58.872270 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l9zjr"] Oct 01 11:43:58 crc kubenswrapper[4669]: W1001 11:43:58.929217 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8de1ee14_9826_4903_a1d3_ee0b8c2416c6.slice/crio-dd0ae566256a2d3f23c4c99bddae1842ff62c7a77b29cd061d1abef7409414c4 WatchSource:0}: Error finding container dd0ae566256a2d3f23c4c99bddae1842ff62c7a77b29cd061d1abef7409414c4: Status 404 returned error can't find the container with id dd0ae566256a2d3f23c4c99bddae1842ff62c7a77b29cd061d1abef7409414c4 Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.117968 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" event={"ID":"8de1ee14-9826-4903-a1d3-ee0b8c2416c6","Type":"ContainerStarted","Data":"dd0ae566256a2d3f23c4c99bddae1842ff62c7a77b29cd061d1abef7409414c4"} Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.160117 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.161524 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.167207 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vq4fz" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.167313 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.167391 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.167592 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.168584 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.169431 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.185425 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.189785 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.288604 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-config-data\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.288689 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.288719 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.289291 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.289435 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/500653c7-d0f6-46d5-9411-60a17569fdd3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.289468 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l52c8\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-kube-api-access-l52c8\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.289630 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.289697 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.289780 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/500653c7-d0f6-46d5-9411-60a17569fdd3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.289852 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.289880 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.391402 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") 
" pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.391783 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/500653c7-d0f6-46d5-9411-60a17569fdd3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.391810 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l52c8\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-kube-api-access-l52c8\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.392695 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.392724 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.392754 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/500653c7-d0f6-46d5-9411-60a17569fdd3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.392776 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.392810 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.392870 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-config-data\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.392905 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.392929 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.393188 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: 
\"500653c7-d0f6-46d5-9411-60a17569fdd3\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.396811 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.396831 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.397503 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.397620 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-config-data\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.398200 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.398328 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/500653c7-d0f6-46d5-9411-60a17569fdd3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.398922 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/500653c7-d0f6-46d5-9411-60a17569fdd3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.399434 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.400024 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.412038 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l52c8\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-kube-api-access-l52c8\") pod \"rabbitmq-server-0\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.419736 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: 
\"500653c7-d0f6-46d5-9411-60a17569fdd3\") " pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.431470 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mlrtj"] Oct 01 11:43:59 crc kubenswrapper[4669]: W1001 11:43:59.453348 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode20229bd_54f8_4a0e_a2d4_42102e32950e.slice/crio-7b2cf604c60fc1d0d97ec2c7be8a585388ebd4fe20880e6737974081da400127 WatchSource:0}: Error finding container 7b2cf604c60fc1d0d97ec2c7be8a585388ebd4fe20880e6737974081da400127: Status 404 returned error can't find the container with id 7b2cf604c60fc1d0d97ec2c7be8a585388ebd4fe20880e6737974081da400127 Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.509787 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.510288 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.516040 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.521643 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.522110 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4gvdd" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.522326 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.522410 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.522497 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.522741 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.522859 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.532901 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.687309 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="049fd2a0-3cfd-4c63-a2e3-5dde72ebd969" path="/var/lib/kubelet/pods/049fd2a0-3cfd-4c63-a2e3-5dde72ebd969/volumes" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.697813 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.697868 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq558\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-kube-api-access-gq558\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.697899 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4619f705-9393-48c8-bc69-2d6183546af2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.697923 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.697938 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.697962 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.698019 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4619f705-9393-48c8-bc69-2d6183546af2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.698044 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.698071 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.698181 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.698204 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.799627 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.799669 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.799873 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.799899 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq558\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-kube-api-access-gq558\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.799921 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4619f705-9393-48c8-bc69-2d6183546af2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.799942 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.799959 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.799994 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.800032 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4619f705-9393-48c8-bc69-2d6183546af2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.800067 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.800100 
4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.801541 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.803385 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.803677 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.804098 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.804387 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.804412 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.819426 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.820423 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4619f705-9393-48c8-bc69-2d6183546af2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.825729 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq558\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-kube-api-access-gq558\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.825842 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4619f705-9393-48c8-bc69-2d6183546af2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.826601 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:43:59 crc kubenswrapper[4669]: I1001 11:43:59.847539 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:44:00 crc kubenswrapper[4669]: I1001 11:44:00.018900 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 11:44:00 crc kubenswrapper[4669]: W1001 11:44:00.032006 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod500653c7_d0f6_46d5_9411_60a17569fdd3.slice/crio-2a403e600710d456e7a10d42b82b0ac1d27de0eca030353e1597e302cfd29dfd WatchSource:0}: Error finding container 2a403e600710d456e7a10d42b82b0ac1d27de0eca030353e1597e302cfd29dfd: Status 404 returned error can't find the container with id 2a403e600710d456e7a10d42b82b0ac1d27de0eca030353e1597e302cfd29dfd Oct 01 11:44:00 crc kubenswrapper[4669]: I1001 11:44:00.142871 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" event={"ID":"e20229bd-54f8-4a0e-a2d4-42102e32950e","Type":"ContainerStarted","Data":"7b2cf604c60fc1d0d97ec2c7be8a585388ebd4fe20880e6737974081da400127"} Oct 01 11:44:00 crc kubenswrapper[4669]: I1001 11:44:00.144117 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:44:00 crc kubenswrapper[4669]: I1001 11:44:00.147879 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"500653c7-d0f6-46d5-9411-60a17569fdd3","Type":"ContainerStarted","Data":"2a403e600710d456e7a10d42b82b0ac1d27de0eca030353e1597e302cfd29dfd"} Oct 01 11:44:00 crc kubenswrapper[4669]: I1001 11:44:00.637088 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.170285 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4619f705-9393-48c8-bc69-2d6183546af2","Type":"ContainerStarted","Data":"036559d558945151b60ce4d18cb5d38688b0ad17d7e90e5898bb61e6b0c23e8c"} Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.503806 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.508381 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.510595 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.510972 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-9pr77" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.511143 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.511387 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.514545 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.517678 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.529809 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.630720 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/872d79b4-0374-4e78-98e4-32393e2f7f05-config-data-default\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.631276 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/872d79b4-0374-4e78-98e4-32393e2f7f05-kolla-config\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 
01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.631300 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/872d79b4-0374-4e78-98e4-32393e2f7f05-secrets\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.631400 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/872d79b4-0374-4e78-98e4-32393e2f7f05-operator-scripts\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.631441 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/872d79b4-0374-4e78-98e4-32393e2f7f05-config-data-generated\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.631469 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/872d79b4-0374-4e78-98e4-32393e2f7f05-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.631510 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872d79b4-0374-4e78-98e4-32393e2f7f05-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 
11:44:01.631531 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.631555 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h82xm\" (UniqueName: \"kubernetes.io/projected/872d79b4-0374-4e78-98e4-32393e2f7f05-kube-api-access-h82xm\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.733460 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h82xm\" (UniqueName: \"kubernetes.io/projected/872d79b4-0374-4e78-98e4-32393e2f7f05-kube-api-access-h82xm\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.733529 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/872d79b4-0374-4e78-98e4-32393e2f7f05-config-data-default\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.733605 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/872d79b4-0374-4e78-98e4-32393e2f7f05-secrets\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.733630 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/872d79b4-0374-4e78-98e4-32393e2f7f05-kolla-config\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.733684 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/872d79b4-0374-4e78-98e4-32393e2f7f05-operator-scripts\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.733748 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/872d79b4-0374-4e78-98e4-32393e2f7f05-config-data-generated\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.733768 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/872d79b4-0374-4e78-98e4-32393e2f7f05-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.733827 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872d79b4-0374-4e78-98e4-32393e2f7f05-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.733869 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: 
\"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.734599 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/872d79b4-0374-4e78-98e4-32393e2f7f05-config-data-default\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.734966 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.736408 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/872d79b4-0374-4e78-98e4-32393e2f7f05-kolla-config\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.736824 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/872d79b4-0374-4e78-98e4-32393e2f7f05-config-data-generated\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.738201 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/872d79b4-0374-4e78-98e4-32393e2f7f05-operator-scripts\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.744993 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/872d79b4-0374-4e78-98e4-32393e2f7f05-secrets\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.745450 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/872d79b4-0374-4e78-98e4-32393e2f7f05-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.748797 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872d79b4-0374-4e78-98e4-32393e2f7f05-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.759324 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h82xm\" (UniqueName: \"kubernetes.io/projected/872d79b4-0374-4e78-98e4-32393e2f7f05-kube-api-access-h82xm\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.760294 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"872d79b4-0374-4e78-98e4-32393e2f7f05\") " pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.844011 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.931923 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.934070 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.939158 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.939510 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9txfw" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.939773 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.939932 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 01 11:44:01 crc kubenswrapper[4669]: I1001 11:44:01.946555 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.040561 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z42c\" (UniqueName: \"kubernetes.io/projected/92bd05a8-df03-4e85-b32a-dc3ced713159-kube-api-access-8z42c\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.040629 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bd05a8-df03-4e85-b32a-dc3ced713159-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.040671 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/92bd05a8-df03-4e85-b32a-dc3ced713159-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.040708 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.040740 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/92bd05a8-df03-4e85-b32a-dc3ced713159-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.040778 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/92bd05a8-df03-4e85-b32a-dc3ced713159-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.040814 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92bd05a8-df03-4e85-b32a-dc3ced713159-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.040851 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/92bd05a8-df03-4e85-b32a-dc3ced713159-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.040920 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/92bd05a8-df03-4e85-b32a-dc3ced713159-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.142919 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bd05a8-df03-4e85-b32a-dc3ced713159-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.143156 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z42c\" (UniqueName: \"kubernetes.io/projected/92bd05a8-df03-4e85-b32a-dc3ced713159-kube-api-access-8z42c\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.143191 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/92bd05a8-df03-4e85-b32a-dc3ced713159-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.143214 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.143236 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/92bd05a8-df03-4e85-b32a-dc3ced713159-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.143262 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/92bd05a8-df03-4e85-b32a-dc3ced713159-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.143288 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92bd05a8-df03-4e85-b32a-dc3ced713159-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.143317 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/92bd05a8-df03-4e85-b32a-dc3ced713159-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 
11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.143364 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/92bd05a8-df03-4e85-b32a-dc3ced713159-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.144262 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.145289 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/92bd05a8-df03-4e85-b32a-dc3ced713159-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.145393 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/92bd05a8-df03-4e85-b32a-dc3ced713159-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.145748 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/92bd05a8-df03-4e85-b32a-dc3ced713159-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.148162 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/92bd05a8-df03-4e85-b32a-dc3ced713159-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.149784 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92bd05a8-df03-4e85-b32a-dc3ced713159-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.151671 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92bd05a8-df03-4e85-b32a-dc3ced713159-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.154358 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/92bd05a8-df03-4e85-b32a-dc3ced713159-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.162722 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z42c\" (UniqueName: \"kubernetes.io/projected/92bd05a8-df03-4e85-b32a-dc3ced713159-kube-api-access-8z42c\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.176934 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"92bd05a8-df03-4e85-b32a-dc3ced713159\") " pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.270617 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.426891 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.428006 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.431199 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.431424 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-49vqc" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.431534 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.454630 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.576918 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dda17c6-d274-4975-8796-deda5fd09e9c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.577001 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f4wh\" (UniqueName: \"kubernetes.io/projected/0dda17c6-d274-4975-8796-deda5fd09e9c-kube-api-access-5f4wh\") pod 
\"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.577037 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dda17c6-d274-4975-8796-deda5fd09e9c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.577466 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0dda17c6-d274-4975-8796-deda5fd09e9c-config-data\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.577692 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0dda17c6-d274-4975-8796-deda5fd09e9c-kolla-config\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.679264 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0dda17c6-d274-4975-8796-deda5fd09e9c-kolla-config\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.679365 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dda17c6-d274-4975-8796-deda5fd09e9c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.679403 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dda17c6-d274-4975-8796-deda5fd09e9c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.679428 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f4wh\" (UniqueName: \"kubernetes.io/projected/0dda17c6-d274-4975-8796-deda5fd09e9c-kube-api-access-5f4wh\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.679509 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0dda17c6-d274-4975-8796-deda5fd09e9c-config-data\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.680947 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0dda17c6-d274-4975-8796-deda5fd09e9c-kolla-config\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.681512 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0dda17c6-d274-4975-8796-deda5fd09e9c-config-data\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.688321 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dda17c6-d274-4975-8796-deda5fd09e9c-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.694052 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dda17c6-d274-4975-8796-deda5fd09e9c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.703348 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f4wh\" (UniqueName: \"kubernetes.io/projected/0dda17c6-d274-4975-8796-deda5fd09e9c-kube-api-access-5f4wh\") pod \"memcached-0\" (UID: \"0dda17c6-d274-4975-8796-deda5fd09e9c\") " pod="openstack/memcached-0" Oct 01 11:44:02 crc kubenswrapper[4669]: I1001 11:44:02.832121 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 11:44:04 crc kubenswrapper[4669]: I1001 11:44:04.235435 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 11:44:04 crc kubenswrapper[4669]: I1001 11:44:04.245901 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 11:44:04 crc kubenswrapper[4669]: I1001 11:44:04.249409 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 11:44:04 crc kubenswrapper[4669]: I1001 11:44:04.254696 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-kjnss" Oct 01 11:44:04 crc kubenswrapper[4669]: I1001 11:44:04.423287 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn2pf\" (UniqueName: \"kubernetes.io/projected/9d832718-661a-44fb-bcc8-7f48af908b15-kube-api-access-tn2pf\") pod \"kube-state-metrics-0\" (UID: \"9d832718-661a-44fb-bcc8-7f48af908b15\") " pod="openstack/kube-state-metrics-0" Oct 01 11:44:04 crc kubenswrapper[4669]: I1001 11:44:04.525053 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn2pf\" (UniqueName: \"kubernetes.io/projected/9d832718-661a-44fb-bcc8-7f48af908b15-kube-api-access-tn2pf\") pod \"kube-state-metrics-0\" (UID: \"9d832718-661a-44fb-bcc8-7f48af908b15\") " pod="openstack/kube-state-metrics-0" Oct 01 11:44:04 crc kubenswrapper[4669]: I1001 11:44:04.548627 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn2pf\" (UniqueName: \"kubernetes.io/projected/9d832718-661a-44fb-bcc8-7f48af908b15-kube-api-access-tn2pf\") pod \"kube-state-metrics-0\" (UID: \"9d832718-661a-44fb-bcc8-7f48af908b15\") " pod="openstack/kube-state-metrics-0" Oct 01 11:44:04 crc kubenswrapper[4669]: I1001 11:44:04.573670 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.065564 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-plhdj"] Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.067591 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.078062 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nf7l5" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.078293 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.079392 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.079537 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plhdj"] Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.086210 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-d5fz7"] Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.088281 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.148361 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-d5fz7"] Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.173895 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-etc-ovs\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.173994 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6cql\" (UniqueName: \"kubernetes.io/projected/c5ffe639-af06-4c4c-8794-a1becff8a692-kube-api-access-p6cql\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.174089 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ffe639-af06-4c4c-8794-a1becff8a692-var-run\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.174167 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ffe639-af06-4c4c-8794-a1becff8a692-var-run-ovn\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.174305 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-scripts\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.174404 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ffe639-af06-4c4c-8794-a1becff8a692-combined-ca-bundle\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.174519 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-var-log\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.174672 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-var-run\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.174822 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ffe639-af06-4c4c-8794-a1becff8a692-scripts\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.174911 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-var-lib\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.175004 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ffe639-af06-4c4c-8794-a1becff8a692-var-log-ovn\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.175038 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrmq\" (UniqueName: \"kubernetes.io/projected/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-kube-api-access-jvrmq\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.175279 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ffe639-af06-4c4c-8794-a1becff8a692-ovn-controller-tls-certs\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277427 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ffe639-af06-4c4c-8794-a1becff8a692-var-log-ovn\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277493 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrmq\" (UniqueName: 
\"kubernetes.io/projected/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-kube-api-access-jvrmq\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277559 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ffe639-af06-4c4c-8794-a1becff8a692-ovn-controller-tls-certs\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277600 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-etc-ovs\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277642 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6cql\" (UniqueName: \"kubernetes.io/projected/c5ffe639-af06-4c4c-8794-a1becff8a692-kube-api-access-p6cql\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277686 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ffe639-af06-4c4c-8794-a1becff8a692-var-run\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277723 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ffe639-af06-4c4c-8794-a1becff8a692-var-run-ovn\") pod \"ovn-controller-plhdj\" 
(UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277761 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-scripts\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277784 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ffe639-af06-4c4c-8794-a1becff8a692-combined-ca-bundle\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277816 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-var-log\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277843 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-var-run\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277874 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ffe639-af06-4c4c-8794-a1becff8a692-scripts\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.277903 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-var-lib\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.278141 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ffe639-af06-4c4c-8794-a1becff8a692-var-log-ovn\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.278317 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-var-lib\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.278365 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ffe639-af06-4c4c-8794-a1becff8a692-var-run\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.278476 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ffe639-af06-4c4c-8794-a1becff8a692-var-run-ovn\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.278524 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-var-run\") pod \"ovn-controller-ovs-d5fz7\" 
(UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.278607 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-var-log\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.278893 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-etc-ovs\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.280596 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-scripts\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.281443 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ffe639-af06-4c4c-8794-a1becff8a692-scripts\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.286191 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ffe639-af06-4c4c-8794-a1becff8a692-combined-ca-bundle\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.290150 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ffe639-af06-4c4c-8794-a1becff8a692-ovn-controller-tls-certs\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.305897 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6cql\" (UniqueName: \"kubernetes.io/projected/c5ffe639-af06-4c4c-8794-a1becff8a692-kube-api-access-p6cql\") pod \"ovn-controller-plhdj\" (UID: \"c5ffe639-af06-4c4c-8794-a1becff8a692\") " pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.306338 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrmq\" (UniqueName: \"kubernetes.io/projected/1c9e9459-07b3-4f2d-9385-7c41a5bb6edd-kube-api-access-jvrmq\") pod \"ovn-controller-ovs-d5fz7\" (UID: \"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd\") " pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.436697 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-plhdj" Oct 01 11:44:07 crc kubenswrapper[4669]: I1001 11:44:07.447286 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.502445 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.505455 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.508213 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.508583 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.508840 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pn72j" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.509125 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.510519 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.535861 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.607540 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c8bfa8-2fca-4a74-85e8-f44af35d612f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.607644 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/76c8bfa8-2fca-4a74-85e8-f44af35d612f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.607673 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlbvk\" (UniqueName: \"kubernetes.io/projected/76c8bfa8-2fca-4a74-85e8-f44af35d612f-kube-api-access-nlbvk\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.607733 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.607760 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/76c8bfa8-2fca-4a74-85e8-f44af35d612f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.607790 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/76c8bfa8-2fca-4a74-85e8-f44af35d612f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.607832 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c8bfa8-2fca-4a74-85e8-f44af35d612f-config\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.607857 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/76c8bfa8-2fca-4a74-85e8-f44af35d612f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.709671 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.709740 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/76c8bfa8-2fca-4a74-85e8-f44af35d612f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.709783 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/76c8bfa8-2fca-4a74-85e8-f44af35d612f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.709827 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c8bfa8-2fca-4a74-85e8-f44af35d612f-config\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.709856 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76c8bfa8-2fca-4a74-85e8-f44af35d612f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc 
kubenswrapper[4669]: I1001 11:44:08.709939 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c8bfa8-2fca-4a74-85e8-f44af35d612f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.709981 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/76c8bfa8-2fca-4a74-85e8-f44af35d612f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.710004 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlbvk\" (UniqueName: \"kubernetes.io/projected/76c8bfa8-2fca-4a74-85e8-f44af35d612f-kube-api-access-nlbvk\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.710811 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.711412 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c8bfa8-2fca-4a74-85e8-f44af35d612f-config\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.711877 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/76c8bfa8-2fca-4a74-85e8-f44af35d612f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.711893 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/76c8bfa8-2fca-4a74-85e8-f44af35d612f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.714939 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/76c8bfa8-2fca-4a74-85e8-f44af35d612f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.717832 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/76c8bfa8-2fca-4a74-85e8-f44af35d612f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.726296 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c8bfa8-2fca-4a74-85e8-f44af35d612f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.738785 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlbvk\" (UniqueName: \"kubernetes.io/projected/76c8bfa8-2fca-4a74-85e8-f44af35d612f-kube-api-access-nlbvk\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " 
pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.755472 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"76c8bfa8-2fca-4a74-85e8-f44af35d612f\") " pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:08 crc kubenswrapper[4669]: I1001 11:44:08.840155 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:11 crc kubenswrapper[4669]: I1001 11:44:11.884491 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 11:44:11 crc kubenswrapper[4669]: I1001 11:44:11.886976 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:11 crc kubenswrapper[4669]: I1001 11:44:11.892055 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-qwfcr" Oct 01 11:44:11 crc kubenswrapper[4669]: I1001 11:44:11.892312 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 01 11:44:11 crc kubenswrapper[4669]: I1001 11:44:11.892710 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 01 11:44:11 crc kubenswrapper[4669]: I1001 11:44:11.893100 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 01 11:44:11 crc kubenswrapper[4669]: I1001 11:44:11.900944 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.011136 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.011467 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d13ad6e-a577-4f92-95ea-8ad268373774-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.011617 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d13ad6e-a577-4f92-95ea-8ad268373774-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.011778 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d13ad6e-a577-4f92-95ea-8ad268373774-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.011903 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chblr\" (UniqueName: \"kubernetes.io/projected/1d13ad6e-a577-4f92-95ea-8ad268373774-kube-api-access-chblr\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.012036 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d13ad6e-a577-4f92-95ea-8ad268373774-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " 
pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.012387 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d13ad6e-a577-4f92-95ea-8ad268373774-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.012515 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d13ad6e-a577-4f92-95ea-8ad268373774-config\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.090969 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.114258 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d13ad6e-a577-4f92-95ea-8ad268373774-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.114318 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chblr\" (UniqueName: \"kubernetes.io/projected/1d13ad6e-a577-4f92-95ea-8ad268373774-kube-api-access-chblr\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.114349 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d13ad6e-a577-4f92-95ea-8ad268373774-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.114404 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d13ad6e-a577-4f92-95ea-8ad268373774-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.114421 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d13ad6e-a577-4f92-95ea-8ad268373774-config\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.114481 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.114509 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d13ad6e-a577-4f92-95ea-8ad268373774-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.114525 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d13ad6e-a577-4f92-95ea-8ad268373774-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.115577 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d13ad6e-a577-4f92-95ea-8ad268373774-config\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.115734 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.116932 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d13ad6e-a577-4f92-95ea-8ad268373774-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.117146 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d13ad6e-a577-4f92-95ea-8ad268373774-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.122572 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d13ad6e-a577-4f92-95ea-8ad268373774-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.122966 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d13ad6e-a577-4f92-95ea-8ad268373774-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") 
" pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.132665 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chblr\" (UniqueName: \"kubernetes.io/projected/1d13ad6e-a577-4f92-95ea-8ad268373774-kube-api-access-chblr\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.135034 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.139571 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d13ad6e-a577-4f92-95ea-8ad268373774-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1d13ad6e-a577-4f92-95ea-8ad268373774\") " pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:12 crc kubenswrapper[4669]: I1001 11:44:12.216273 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:18 crc kubenswrapper[4669]: W1001 11:44:18.772388 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dda17c6_d274_4975_8796_deda5fd09e9c.slice/crio-6b88236f828ed1a4268e92930d3005e224f525333a39efa3fd86578f89bf8085 WatchSource:0}: Error finding container 6b88236f828ed1a4268e92930d3005e224f525333a39efa3fd86578f89bf8085: Status 404 returned error can't find the container with id 6b88236f828ed1a4268e92930d3005e224f525333a39efa3fd86578f89bf8085 Oct 01 11:44:18 crc kubenswrapper[4669]: I1001 11:44:18.782487 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 11:44:19 crc kubenswrapper[4669]: I1001 11:44:19.372517 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0dda17c6-d274-4975-8796-deda5fd09e9c","Type":"ContainerStarted","Data":"6b88236f828ed1a4268e92930d3005e224f525333a39efa3fd86578f89bf8085"} Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.647008 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.647635 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s8x72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-l9zjr_openstack(8de1ee14-9826-4903-a1d3-ee0b8c2416c6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.649153 4669 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" podUID="8de1ee14-9826-4903-a1d3-ee0b8c2416c6" Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.760723 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.761222 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rbf5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-rghs8_openstack(6adaa222-a936-47ee-bec6-facc69d10d36): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.764255 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" podUID="6adaa222-a936-47ee-bec6-facc69d10d36" Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.786848 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.787011 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-htbt2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-bb4qz_openstack(ca892c75-1ca1-4192-b6da-7ac18e3eba1a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.788526 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" podUID="ca892c75-1ca1-4192-b6da-7ac18e3eba1a" Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.830489 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.831674 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fv5sn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-mlrtj_openstack(e20229bd-54f8-4a0e-a2d4-42102e32950e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 11:44:19 crc kubenswrapper[4669]: E1001 11:44:19.833131 4669 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" podUID="e20229bd-54f8-4a0e-a2d4-42102e32950e" Oct 01 11:44:20 crc kubenswrapper[4669]: W1001 11:44:20.154989 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod872d79b4_0374_4e78_98e4_32393e2f7f05.slice/crio-e65eb5ebc060f78b1523c4aa4a1b3740254e6c26fd9842e1c61bb176e9f7e642 WatchSource:0}: Error finding container e65eb5ebc060f78b1523c4aa4a1b3740254e6c26fd9842e1c61bb176e9f7e642: Status 404 returned error can't find the container with id e65eb5ebc060f78b1523c4aa4a1b3740254e6c26fd9842e1c61bb176e9f7e642 Oct 01 11:44:20 crc kubenswrapper[4669]: I1001 11:44:20.162292 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 11:44:20 crc kubenswrapper[4669]: I1001 11:44:20.272598 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 11:44:20 crc kubenswrapper[4669]: I1001 11:44:20.280534 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plhdj"] Oct 01 11:44:20 crc kubenswrapper[4669]: W1001 11:44:20.288041 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92bd05a8_df03_4e85_b32a_dc3ced713159.slice/crio-dc493d5a95d91762c74270329e159634dbc7acaa390ccd6cbc95bbb5ef641bc0 WatchSource:0}: Error finding container dc493d5a95d91762c74270329e159634dbc7acaa390ccd6cbc95bbb5ef641bc0: Status 404 returned error can't find the container with id dc493d5a95d91762c74270329e159634dbc7acaa390ccd6cbc95bbb5ef641bc0 Oct 01 11:44:20 crc kubenswrapper[4669]: W1001 11:44:20.296741 4669 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5ffe639_af06_4c4c_8794_a1becff8a692.slice/crio-007443e0314c3720ee91228eba877261e782b247d706e8edbce45406e84f0cef WatchSource:0}: Error finding container 007443e0314c3720ee91228eba877261e782b247d706e8edbce45406e84f0cef: Status 404 returned error can't find the container with id 007443e0314c3720ee91228eba877261e782b247d706e8edbce45406e84f0cef Oct 01 11:44:20 crc kubenswrapper[4669]: I1001 11:44:20.359576 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 11:44:20 crc kubenswrapper[4669]: W1001 11:44:20.367601 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76c8bfa8_2fca_4a74_85e8_f44af35d612f.slice/crio-883a43aba813d087299e4bbba13602a149664ed3d78c81868cca742fbb846ed0 WatchSource:0}: Error finding container 883a43aba813d087299e4bbba13602a149664ed3d78c81868cca742fbb846ed0: Status 404 returned error can't find the container with id 883a43aba813d087299e4bbba13602a149664ed3d78c81868cca742fbb846ed0 Oct 01 11:44:20 crc kubenswrapper[4669]: I1001 11:44:20.383791 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"76c8bfa8-2fca-4a74-85e8-f44af35d612f","Type":"ContainerStarted","Data":"883a43aba813d087299e4bbba13602a149664ed3d78c81868cca742fbb846ed0"} Oct 01 11:44:20 crc kubenswrapper[4669]: I1001 11:44:20.385179 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"92bd05a8-df03-4e85-b32a-dc3ced713159","Type":"ContainerStarted","Data":"dc493d5a95d91762c74270329e159634dbc7acaa390ccd6cbc95bbb5ef641bc0"} Oct 01 11:44:20 crc kubenswrapper[4669]: I1001 11:44:20.386343 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"872d79b4-0374-4e78-98e4-32393e2f7f05","Type":"ContainerStarted","Data":"e65eb5ebc060f78b1523c4aa4a1b3740254e6c26fd9842e1c61bb176e9f7e642"} Oct 01 11:44:20 crc kubenswrapper[4669]: I1001 11:44:20.387701 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plhdj" event={"ID":"c5ffe639-af06-4c4c-8794-a1becff8a692","Type":"ContainerStarted","Data":"007443e0314c3720ee91228eba877261e782b247d706e8edbce45406e84f0cef"} Oct 01 11:44:20 crc kubenswrapper[4669]: E1001 11:44:20.430939 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" podUID="e20229bd-54f8-4a0e-a2d4-42102e32950e" Oct 01 11:44:20 crc kubenswrapper[4669]: E1001 11:44:20.430950 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" podUID="8de1ee14-9826-4903-a1d3-ee0b8c2416c6" Oct 01 11:44:20 crc kubenswrapper[4669]: I1001 11:44:20.465000 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 11:44:20 crc kubenswrapper[4669]: W1001 11:44:20.538657 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d832718_661a_44fb_bcc8_7f48af908b15.slice/crio-c2199de0a4468222ad06e6c8509deff730da8185a2e350a8c2d0de967188b3f7 WatchSource:0}: Error finding container c2199de0a4468222ad06e6c8509deff730da8185a2e350a8c2d0de967188b3f7: Status 404 returned error can't find the container with id c2199de0a4468222ad06e6c8509deff730da8185a2e350a8c2d0de967188b3f7 Oct 01 11:44:20 crc kubenswrapper[4669]: I1001 
11:44:20.563722 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 11:44:20 crc kubenswrapper[4669]: W1001 11:44:20.569867 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d13ad6e_a577_4f92_95ea_8ad268373774.slice/crio-dee9e90d4d36b9fe254dd49a2aaef21dc526ecef5e4ac6b2ddeb3db6173b8ce0 WatchSource:0}: Error finding container dee9e90d4d36b9fe254dd49a2aaef21dc526ecef5e4ac6b2ddeb3db6173b8ce0: Status 404 returned error can't find the container with id dee9e90d4d36b9fe254dd49a2aaef21dc526ecef5e4ac6b2ddeb3db6173b8ce0 Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.156408 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-d5fz7"] Oct 01 11:44:21 crc kubenswrapper[4669]: W1001 11:44:21.281971 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c9e9459_07b3_4f2d_9385_7c41a5bb6edd.slice/crio-cb3894bb51f648b27380f82b9fe94f10c3f278becd2c6976638b4e166d915f41 WatchSource:0}: Error finding container cb3894bb51f648b27380f82b9fe94f10c3f278becd2c6976638b4e166d915f41: Status 404 returned error can't find the container with id cb3894bb51f648b27380f82b9fe94f10c3f278becd2c6976638b4e166d915f41 Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.402867 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1d13ad6e-a577-4f92-95ea-8ad268373774","Type":"ContainerStarted","Data":"dee9e90d4d36b9fe254dd49a2aaef21dc526ecef5e4ac6b2ddeb3db6173b8ce0"} Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.405378 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4619f705-9393-48c8-bc69-2d6183546af2","Type":"ContainerStarted","Data":"809ebe8a7f9b3cd52ba8893dd5e5d7f364e22cda0ade7a5b0d6d5c665aced5b6"} Oct 01 11:44:21 crc 
kubenswrapper[4669]: I1001 11:44:21.407894 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9d832718-661a-44fb-bcc8-7f48af908b15","Type":"ContainerStarted","Data":"c2199de0a4468222ad06e6c8509deff730da8185a2e350a8c2d0de967188b3f7"} Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.412886 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" event={"ID":"ca892c75-1ca1-4192-b6da-7ac18e3eba1a","Type":"ContainerDied","Data":"8f0b3ad6863dd027e2cdceacf295da2e6e56f48dd4609ea64d5e18802c41ef55"} Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.412945 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f0b3ad6863dd027e2cdceacf295da2e6e56f48dd4609ea64d5e18802c41ef55" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.415946 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" event={"ID":"6adaa222-a936-47ee-bec6-facc69d10d36","Type":"ContainerDied","Data":"0be579c7ba588400caee7656af56aa7893a4b52d2dab4dbf4104f9deac4809a4"} Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.415986 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0be579c7ba588400caee7656af56aa7893a4b52d2dab4dbf4104f9deac4809a4" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.424420 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d5fz7" event={"ID":"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd","Type":"ContainerStarted","Data":"cb3894bb51f648b27380f82b9fe94f10c3f278becd2c6976638b4e166d915f41"} Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.426642 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.456461 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.554899 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htbt2\" (UniqueName: \"kubernetes.io/projected/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-kube-api-access-htbt2\") pod \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.555010 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adaa222-a936-47ee-bec6-facc69d10d36-config\") pod \"6adaa222-a936-47ee-bec6-facc69d10d36\" (UID: \"6adaa222-a936-47ee-bec6-facc69d10d36\") " Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.555027 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-config\") pod \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.555069 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-dns-svc\") pod \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\" (UID: \"ca892c75-1ca1-4192-b6da-7ac18e3eba1a\") " Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.555170 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rbf5\" (UniqueName: \"kubernetes.io/projected/6adaa222-a936-47ee-bec6-facc69d10d36-kube-api-access-4rbf5\") pod \"6adaa222-a936-47ee-bec6-facc69d10d36\" (UID: \"6adaa222-a936-47ee-bec6-facc69d10d36\") " Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.556825 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6adaa222-a936-47ee-bec6-facc69d10d36-config" (OuterVolumeSpecName: "config") pod "6adaa222-a936-47ee-bec6-facc69d10d36" (UID: "6adaa222-a936-47ee-bec6-facc69d10d36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.557272 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-config" (OuterVolumeSpecName: "config") pod "ca892c75-1ca1-4192-b6da-7ac18e3eba1a" (UID: "ca892c75-1ca1-4192-b6da-7ac18e3eba1a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.557457 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca892c75-1ca1-4192-b6da-7ac18e3eba1a" (UID: "ca892c75-1ca1-4192-b6da-7ac18e3eba1a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.566418 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-kube-api-access-htbt2" (OuterVolumeSpecName: "kube-api-access-htbt2") pod "ca892c75-1ca1-4192-b6da-7ac18e3eba1a" (UID: "ca892c75-1ca1-4192-b6da-7ac18e3eba1a"). InnerVolumeSpecName "kube-api-access-htbt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.566740 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6adaa222-a936-47ee-bec6-facc69d10d36-kube-api-access-4rbf5" (OuterVolumeSpecName: "kube-api-access-4rbf5") pod "6adaa222-a936-47ee-bec6-facc69d10d36" (UID: "6adaa222-a936-47ee-bec6-facc69d10d36"). InnerVolumeSpecName "kube-api-access-4rbf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.656965 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rbf5\" (UniqueName: \"kubernetes.io/projected/6adaa222-a936-47ee-bec6-facc69d10d36-kube-api-access-4rbf5\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.657003 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htbt2\" (UniqueName: \"kubernetes.io/projected/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-kube-api-access-htbt2\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.657022 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adaa222-a936-47ee-bec6-facc69d10d36-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.657037 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:21 crc kubenswrapper[4669]: I1001 11:44:21.657048 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca892c75-1ca1-4192-b6da-7ac18e3eba1a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:22 crc kubenswrapper[4669]: I1001 11:44:22.436922 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0dda17c6-d274-4975-8796-deda5fd09e9c","Type":"ContainerStarted","Data":"953f9ead9e32847057bf8dce8dc04bc1c0ec47050f5ee74cef341bde72f4188a"} Oct 01 11:44:22 crc kubenswrapper[4669]: I1001 11:44:22.439724 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 01 11:44:22 crc kubenswrapper[4669]: I1001 11:44:22.441987 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bb4qz" Oct 01 11:44:22 crc kubenswrapper[4669]: I1001 11:44:22.446344 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"500653c7-d0f6-46d5-9411-60a17569fdd3","Type":"ContainerStarted","Data":"f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c"} Oct 01 11:44:22 crc kubenswrapper[4669]: I1001 11:44:22.446467 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rghs8" Oct 01 11:44:22 crc kubenswrapper[4669]: I1001 11:44:22.495128 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.952946354 podStartE2EDuration="20.495107365s" podCreationTimestamp="2025-10-01 11:44:02 +0000 UTC" firstStartedPulling="2025-10-01 11:44:18.782127582 +0000 UTC m=+949.881692569" lastFinishedPulling="2025-10-01 11:44:21.324288603 +0000 UTC m=+952.423853580" observedRunningTime="2025-10-01 11:44:22.466183652 +0000 UTC m=+953.565748649" watchObservedRunningTime="2025-10-01 11:44:22.495107365 +0000 UTC m=+953.594672342" Oct 01 11:44:22 crc kubenswrapper[4669]: I1001 11:44:22.504149 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rghs8"] Oct 01 11:44:22 crc kubenswrapper[4669]: I1001 11:44:22.509019 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rghs8"] Oct 01 11:44:22 crc kubenswrapper[4669]: I1001 11:44:22.565349 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bb4qz"] Oct 01 11:44:22 crc kubenswrapper[4669]: I1001 11:44:22.571357 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bb4qz"] Oct 01 11:44:23 crc kubenswrapper[4669]: I1001 11:44:23.656843 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6adaa222-a936-47ee-bec6-facc69d10d36" 
path="/var/lib/kubelet/pods/6adaa222-a936-47ee-bec6-facc69d10d36/volumes" Oct 01 11:44:23 crc kubenswrapper[4669]: I1001 11:44:23.658912 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca892c75-1ca1-4192-b6da-7ac18e3eba1a" path="/var/lib/kubelet/pods/ca892c75-1ca1-4192-b6da-7ac18e3eba1a/volumes" Oct 01 11:44:27 crc kubenswrapper[4669]: I1001 11:44:27.834282 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 01 11:44:28 crc kubenswrapper[4669]: I1001 11:44:28.506462 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plhdj" event={"ID":"c5ffe639-af06-4c4c-8794-a1becff8a692","Type":"ContainerStarted","Data":"8e0dfa16a5fdcff756e0562a34c76d8edf17ae2749231e80cd8574b142f98a53"} Oct 01 11:44:28 crc kubenswrapper[4669]: I1001 11:44:28.507069 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-plhdj" Oct 01 11:44:28 crc kubenswrapper[4669]: I1001 11:44:28.508316 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d5fz7" event={"ID":"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd","Type":"ContainerStarted","Data":"7e0189577ed8b9dd9331aca2c829fd66dbc42866d92df9145af606a0906d7826"} Oct 01 11:44:28 crc kubenswrapper[4669]: I1001 11:44:28.509825 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1d13ad6e-a577-4f92-95ea-8ad268373774","Type":"ContainerStarted","Data":"1e542ffd3558ec7f386cf2fff6ecd4d06ec32b7a077ed0222d718a400b954ecb"} Oct 01 11:44:28 crc kubenswrapper[4669]: I1001 11:44:28.511331 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9d832718-661a-44fb-bcc8-7f48af908b15","Type":"ContainerStarted","Data":"d7e3f25f398c9230aa728260c9f128890931fd534923084cf3e18d565b8c6014"} Oct 01 11:44:28 crc kubenswrapper[4669]: I1001 11:44:28.511443 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 11:44:28 crc kubenswrapper[4669]: I1001 11:44:28.512830 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"76c8bfa8-2fca-4a74-85e8-f44af35d612f","Type":"ContainerStarted","Data":"0461a3e7a039ae367f32af8440a098eef8af230f2ce3b66ccf610b4ba46d05d6"} Oct 01 11:44:28 crc kubenswrapper[4669]: I1001 11:44:28.514869 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"92bd05a8-df03-4e85-b32a-dc3ced713159","Type":"ContainerStarted","Data":"680438a7e53f7f853e18491815290b3573df7b85442b8a7469eb7b1ad3b5ebfd"} Oct 01 11:44:28 crc kubenswrapper[4669]: I1001 11:44:28.516568 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"872d79b4-0374-4e78-98e4-32393e2f7f05","Type":"ContainerStarted","Data":"4d71014ddd3116f91933322d474702e4a77bcd01c2c1ce5cdd192f024dd30009"} Oct 01 11:44:28 crc kubenswrapper[4669]: I1001 11:44:28.531344 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-plhdj" podStartSLOduration=13.844705274 podStartE2EDuration="21.531317987s" podCreationTimestamp="2025-10-01 11:44:07 +0000 UTC" firstStartedPulling="2025-10-01 11:44:20.302975963 +0000 UTC m=+951.402540940" lastFinishedPulling="2025-10-01 11:44:27.989588676 +0000 UTC m=+959.089153653" observedRunningTime="2025-10-01 11:44:28.529059882 +0000 UTC m=+959.628624859" watchObservedRunningTime="2025-10-01 11:44:28.531317987 +0000 UTC m=+959.630882964" Oct 01 11:44:28 crc kubenswrapper[4669]: I1001 11:44:28.575551 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.017532739 podStartE2EDuration="24.575532096s" podCreationTimestamp="2025-10-01 11:44:04 +0000 UTC" firstStartedPulling="2025-10-01 11:44:20.54083372 +0000 UTC m=+951.640398697" 
lastFinishedPulling="2025-10-01 11:44:28.098833077 +0000 UTC m=+959.198398054" observedRunningTime="2025-10-01 11:44:28.573007233 +0000 UTC m=+959.672572220" watchObservedRunningTime="2025-10-01 11:44:28.575532096 +0000 UTC m=+959.675097073" Oct 01 11:44:29 crc kubenswrapper[4669]: I1001 11:44:29.527086 4669 generic.go:334] "Generic (PLEG): container finished" podID="1c9e9459-07b3-4f2d-9385-7c41a5bb6edd" containerID="7e0189577ed8b9dd9331aca2c829fd66dbc42866d92df9145af606a0906d7826" exitCode=0 Oct 01 11:44:29 crc kubenswrapper[4669]: I1001 11:44:29.528458 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d5fz7" event={"ID":"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd","Type":"ContainerDied","Data":"7e0189577ed8b9dd9331aca2c829fd66dbc42866d92df9145af606a0906d7826"} Oct 01 11:44:30 crc kubenswrapper[4669]: I1001 11:44:30.541302 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d5fz7" event={"ID":"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd","Type":"ContainerStarted","Data":"9fc5d8a77233a2c1fa861ddd7c901fb59a6cda617ea4726e5a1fb20089649d9a"} Oct 01 11:44:30 crc kubenswrapper[4669]: I1001 11:44:30.541679 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:30 crc kubenswrapper[4669]: I1001 11:44:30.541697 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d5fz7" event={"ID":"1c9e9459-07b3-4f2d-9385-7c41a5bb6edd","Type":"ContainerStarted","Data":"4e9398172a1a72b73dda26f545f736a342234fc087ccbac6eaa58ff9b393015f"} Oct 01 11:44:30 crc kubenswrapper[4669]: I1001 11:44:30.579566 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-d5fz7" podStartSLOduration=16.875662681 podStartE2EDuration="23.579533614s" podCreationTimestamp="2025-10-01 11:44:07 +0000 UTC" firstStartedPulling="2025-10-01 11:44:21.285070218 +0000 UTC m=+952.384635195" 
lastFinishedPulling="2025-10-01 11:44:27.988941151 +0000 UTC m=+959.088506128" observedRunningTime="2025-10-01 11:44:30.57004165 +0000 UTC m=+961.669606627" watchObservedRunningTime="2025-10-01 11:44:30.579533614 +0000 UTC m=+961.679098631" Oct 01 11:44:31 crc kubenswrapper[4669]: I1001 11:44:31.553204 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:44:32 crc kubenswrapper[4669]: I1001 11:44:32.569622 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"76c8bfa8-2fca-4a74-85e8-f44af35d612f","Type":"ContainerStarted","Data":"7f30ef030708055557c8146fff4ab92da3bef8b9a3adf2eb26d268f7a957e161"} Oct 01 11:44:32 crc kubenswrapper[4669]: I1001 11:44:32.573461 4669 generic.go:334] "Generic (PLEG): container finished" podID="92bd05a8-df03-4e85-b32a-dc3ced713159" containerID="680438a7e53f7f853e18491815290b3573df7b85442b8a7469eb7b1ad3b5ebfd" exitCode=0 Oct 01 11:44:32 crc kubenswrapper[4669]: I1001 11:44:32.573653 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"92bd05a8-df03-4e85-b32a-dc3ced713159","Type":"ContainerDied","Data":"680438a7e53f7f853e18491815290b3573df7b85442b8a7469eb7b1ad3b5ebfd"} Oct 01 11:44:32 crc kubenswrapper[4669]: I1001 11:44:32.587263 4669 generic.go:334] "Generic (PLEG): container finished" podID="872d79b4-0374-4e78-98e4-32393e2f7f05" containerID="4d71014ddd3116f91933322d474702e4a77bcd01c2c1ce5cdd192f024dd30009" exitCode=0 Oct 01 11:44:32 crc kubenswrapper[4669]: I1001 11:44:32.587415 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"872d79b4-0374-4e78-98e4-32393e2f7f05","Type":"ContainerDied","Data":"4d71014ddd3116f91933322d474702e4a77bcd01c2c1ce5cdd192f024dd30009"} Oct 01 11:44:32 crc kubenswrapper[4669]: I1001 11:44:32.590249 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"1d13ad6e-a577-4f92-95ea-8ad268373774","Type":"ContainerStarted","Data":"46917f33fec08dcb3a020b2e690591e7216944be6353cf1998cf62916d090975"} Oct 01 11:44:32 crc kubenswrapper[4669]: I1001 11:44:32.612550 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=14.35578829 podStartE2EDuration="25.612509337s" podCreationTimestamp="2025-10-01 11:44:07 +0000 UTC" firstStartedPulling="2025-10-01 11:44:20.370726472 +0000 UTC m=+951.470291449" lastFinishedPulling="2025-10-01 11:44:31.627447519 +0000 UTC m=+962.727012496" observedRunningTime="2025-10-01 11:44:32.60452333 +0000 UTC m=+963.704088327" watchObservedRunningTime="2025-10-01 11:44:32.612509337 +0000 UTC m=+963.712074354" Oct 01 11:44:32 crc kubenswrapper[4669]: I1001 11:44:32.670626 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.605796285 podStartE2EDuration="22.670598897s" podCreationTimestamp="2025-10-01 11:44:10 +0000 UTC" firstStartedPulling="2025-10-01 11:44:20.574461778 +0000 UTC m=+951.674026755" lastFinishedPulling="2025-10-01 11:44:31.63926435 +0000 UTC m=+962.738829367" observedRunningTime="2025-10-01 11:44:32.63418052 +0000 UTC m=+963.733745507" watchObservedRunningTime="2025-10-01 11:44:32.670598897 +0000 UTC m=+963.770163874" Oct 01 11:44:32 crc kubenswrapper[4669]: I1001 11:44:32.841014 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:32 crc kubenswrapper[4669]: I1001 11:44:32.888054 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:33 crc kubenswrapper[4669]: I1001 11:44:33.217368 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:33 crc kubenswrapper[4669]: I1001 11:44:33.286306 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:33 crc kubenswrapper[4669]: I1001 11:44:33.605411 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"92bd05a8-df03-4e85-b32a-dc3ced713159","Type":"ContainerStarted","Data":"b4e2d4937a77ab0f816eab88385f23a4ab912dfd86ac299361790a69e5464666"} Oct 01 11:44:33 crc kubenswrapper[4669]: I1001 11:44:33.610482 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"872d79b4-0374-4e78-98e4-32393e2f7f05","Type":"ContainerStarted","Data":"7d4e803f856b0e5c23fdea8b836edc0b42c939974817787a022bafbf203ee19d"} Oct 01 11:44:33 crc kubenswrapper[4669]: I1001 11:44:33.611441 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:33 crc kubenswrapper[4669]: I1001 11:44:33.611492 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:33 crc kubenswrapper[4669]: I1001 11:44:33.669920 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=25.974438802999998 podStartE2EDuration="33.669886275s" podCreationTimestamp="2025-10-01 11:44:00 +0000 UTC" firstStartedPulling="2025-10-01 11:44:20.293323285 +0000 UTC m=+951.392888262" lastFinishedPulling="2025-10-01 11:44:27.988770737 +0000 UTC m=+959.088335734" observedRunningTime="2025-10-01 11:44:33.649282637 +0000 UTC m=+964.748847644" watchObservedRunningTime="2025-10-01 11:44:33.669886275 +0000 UTC m=+964.769451292" Oct 01 11:44:33 crc kubenswrapper[4669]: I1001 11:44:33.707756 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 01 11:44:33 crc kubenswrapper[4669]: I1001 11:44:33.707832 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 01 11:44:33 crc 
kubenswrapper[4669]: I1001 11:44:33.715358 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.885183755 podStartE2EDuration="33.715330774s" podCreationTimestamp="2025-10-01 11:44:00 +0000 UTC" firstStartedPulling="2025-10-01 11:44:20.158836553 +0000 UTC m=+951.258401530" lastFinishedPulling="2025-10-01 11:44:27.988983572 +0000 UTC m=+959.088548549" observedRunningTime="2025-10-01 11:44:33.690846221 +0000 UTC m=+964.790411238" watchObservedRunningTime="2025-10-01 11:44:33.715330774 +0000 UTC m=+964.814895761" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.050296 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l9zjr"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.130647 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nsrfk"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.131976 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.134506 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.141285 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-lbq97"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.143435 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.149330 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.150506 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nsrfk"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.174644 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-lbq97"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.261149 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-ovn-rundir\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.261208 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-ovs-rundir\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.261254 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7kgw\" (UniqueName: \"kubernetes.io/projected/29b42a35-31e5-4a3b-bcee-209091e48b9c-kube-api-access-c7kgw\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.261281 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-config\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.261300 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.261325 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-config\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.261365 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.261391 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.261421 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6qzxx\" (UniqueName: \"kubernetes.io/projected/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-kube-api-access-6qzxx\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.261440 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-combined-ca-bundle\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.350958 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mlrtj"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.365320 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-ovn-rundir\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.365387 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-ovs-rundir\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.365433 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7kgw\" (UniqueName: \"kubernetes.io/projected/29b42a35-31e5-4a3b-bcee-209091e48b9c-kube-api-access-c7kgw\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " 
pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.365462 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-config\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.365481 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.365504 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-config\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.365549 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.365574 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: 
I1001 11:44:34.365599 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qzxx\" (UniqueName: \"kubernetes.io/projected/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-kube-api-access-6qzxx\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.365617 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-combined-ca-bundle\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.371026 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.371318 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-ovn-rundir\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.371367 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-ovs-rundir\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.372603 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-config\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.374998 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-config\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.375149 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.377307 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-combined-ca-bundle\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.400720 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.424940 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7kgw\" (UniqueName: 
\"kubernetes.io/projected/29b42a35-31e5-4a3b-bcee-209091e48b9c-kube-api-access-c7kgw\") pod \"dnsmasq-dns-7fd796d7df-lbq97\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.427894 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qzxx\" (UniqueName: \"kubernetes.io/projected/b77a4c9a-0426-40f6-a28a-7b985aebc4a2-kube-api-access-6qzxx\") pod \"ovn-controller-metrics-nsrfk\" (UID: \"b77a4c9a-0426-40f6-a28a-7b985aebc4a2\") " pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.465764 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kt8kt"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.467722 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.478389 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kt8kt"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.478946 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nsrfk" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.484553 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.514253 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.517692 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.547147 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.552487 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.553017 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.553644 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.553989 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5grt4" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.555559 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.587071 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.601946 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.602274 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqqgv\" (UniqueName: \"kubernetes.io/projected/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-kube-api-access-qqqgv\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.602344 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.603136 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.603181 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-config\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.706677 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.706784 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.706819 4669 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-config\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.706858 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83f3ffe1-ac22-408f-ab82-73d5cfd82953-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.706895 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83f3ffe1-ac22-408f-ab82-73d5cfd82953-scripts\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.706951 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f3ffe1-ac22-408f-ab82-73d5cfd82953-config\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.706975 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.707022 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqqgv\" (UniqueName: 
\"kubernetes.io/projected/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-kube-api-access-qqqgv\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.707057 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83f3ffe1-ac22-408f-ab82-73d5cfd82953-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.707097 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n62hh\" (UniqueName: \"kubernetes.io/projected/83f3ffe1-ac22-408f-ab82-73d5cfd82953-kube-api-access-n62hh\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.707139 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83f3ffe1-ac22-408f-ab82-73d5cfd82953-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.707164 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/83f3ffe1-ac22-408f-ab82-73d5cfd82953-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.707844 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-dns-svc\") pod 
\"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.708775 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.710768 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-config\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.711642 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.790708 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqqgv\" (UniqueName: \"kubernetes.io/projected/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-kube-api-access-qqqgv\") pod \"dnsmasq-dns-86db49b7ff-kt8kt\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.810981 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83f3ffe1-ac22-408f-ab82-73d5cfd82953-scripts\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 
11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.811141 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f3ffe1-ac22-408f-ab82-73d5cfd82953-config\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.811229 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83f3ffe1-ac22-408f-ab82-73d5cfd82953-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.811253 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n62hh\" (UniqueName: \"kubernetes.io/projected/83f3ffe1-ac22-408f-ab82-73d5cfd82953-kube-api-access-n62hh\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.811298 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83f3ffe1-ac22-408f-ab82-73d5cfd82953-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.811332 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/83f3ffe1-ac22-408f-ab82-73d5cfd82953-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.812823 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/83f3ffe1-ac22-408f-ab82-73d5cfd82953-config\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.827286 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.829726 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-lbq97"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.838320 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83f3ffe1-ac22-408f-ab82-73d5cfd82953-scripts\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.838504 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83f3ffe1-ac22-408f-ab82-73d5cfd82953-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.850398 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83f3ffe1-ac22-408f-ab82-73d5cfd82953-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.868045 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb7fm"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.874456 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/83f3ffe1-ac22-408f-ab82-73d5cfd82953-ovn-northd-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.874580 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83f3ffe1-ac22-408f-ab82-73d5cfd82953-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.874771 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83f3ffe1-ac22-408f-ab82-73d5cfd82953-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.875444 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n62hh\" (UniqueName: \"kubernetes.io/projected/83f3ffe1-ac22-408f-ab82-73d5cfd82953-kube-api-access-n62hh\") pod \"ovn-northd-0\" (UID: \"83f3ffe1-ac22-408f-ab82-73d5cfd82953\") " pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.876652 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.890010 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb7fm"] Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.969959 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlkrt\" (UniqueName: \"kubernetes.io/projected/60fd3501-3d20-401c-b46c-ebc2451bf0ce-kube-api-access-nlkrt\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.970061 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-dns-svc\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.970126 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.970156 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.970192 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-config\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.971235 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 11:44:34 crc kubenswrapper[4669]: I1001 11:44:34.998366 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.053853 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.076930 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-dns-svc\") pod \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.077115 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-config\") pod \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.077154 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8x72\" (UniqueName: \"kubernetes.io/projected/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-kube-api-access-s8x72\") pod \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\" (UID: \"8de1ee14-9826-4903-a1d3-ee0b8c2416c6\") " Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.077388 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.077425 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.077462 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-config\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.077500 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlkrt\" (UniqueName: \"kubernetes.io/projected/60fd3501-3d20-401c-b46c-ebc2451bf0ce-kube-api-access-nlkrt\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.077548 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-dns-svc\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.078399 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-dns-svc\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.081772 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.082059 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-config" (OuterVolumeSpecName: "config") pod "8de1ee14-9826-4903-a1d3-ee0b8c2416c6" (UID: "8de1ee14-9826-4903-a1d3-ee0b8c2416c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.082096 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.082277 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8de1ee14-9826-4903-a1d3-ee0b8c2416c6" (UID: "8de1ee14-9826-4903-a1d3-ee0b8c2416c6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.084052 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-config\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.089493 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-kube-api-access-s8x72" (OuterVolumeSpecName: "kube-api-access-s8x72") pod "8de1ee14-9826-4903-a1d3-ee0b8c2416c6" (UID: "8de1ee14-9826-4903-a1d3-ee0b8c2416c6"). InnerVolumeSpecName "kube-api-access-s8x72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.100458 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlkrt\" (UniqueName: \"kubernetes.io/projected/60fd3501-3d20-401c-b46c-ebc2451bf0ce-kube-api-access-nlkrt\") pod \"dnsmasq-dns-698758b865-xb7fm\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.179179 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-config\") pod \"e20229bd-54f8-4a0e-a2d4-42102e32950e\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.179497 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-dns-svc\") pod \"e20229bd-54f8-4a0e-a2d4-42102e32950e\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " Oct 01 11:44:35 crc kubenswrapper[4669]: 
I1001 11:44:35.179630 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv5sn\" (UniqueName: \"kubernetes.io/projected/e20229bd-54f8-4a0e-a2d4-42102e32950e-kube-api-access-fv5sn\") pod \"e20229bd-54f8-4a0e-a2d4-42102e32950e\" (UID: \"e20229bd-54f8-4a0e-a2d4-42102e32950e\") " Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.180253 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e20229bd-54f8-4a0e-a2d4-42102e32950e" (UID: "e20229bd-54f8-4a0e-a2d4-42102e32950e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.180276 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-config" (OuterVolumeSpecName: "config") pod "e20229bd-54f8-4a0e-a2d4-42102e32950e" (UID: "e20229bd-54f8-4a0e-a2d4-42102e32950e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.180868 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.180995 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.181063 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8x72\" (UniqueName: \"kubernetes.io/projected/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-kube-api-access-s8x72\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.181158 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e20229bd-54f8-4a0e-a2d4-42102e32950e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.181213 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8de1ee14-9826-4903-a1d3-ee0b8c2416c6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.183098 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20229bd-54f8-4a0e-a2d4-42102e32950e-kube-api-access-fv5sn" (OuterVolumeSpecName: "kube-api-access-fv5sn") pod "e20229bd-54f8-4a0e-a2d4-42102e32950e" (UID: "e20229bd-54f8-4a0e-a2d4-42102e32950e"). InnerVolumeSpecName "kube-api-access-fv5sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.271859 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.283240 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv5sn\" (UniqueName: \"kubernetes.io/projected/e20229bd-54f8-4a0e-a2d4-42102e32950e-kube-api-access-fv5sn\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.328259 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nsrfk"] Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.338035 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-lbq97"] Oct 01 11:44:35 crc kubenswrapper[4669]: W1001 11:44:35.362844 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29b42a35_31e5_4a3b_bcee_209091e48b9c.slice/crio-20db7883d2aef422fcfce4eabc9c69fa96af2b921e8eb6628cb8daa4b3224080 WatchSource:0}: Error finding container 20db7883d2aef422fcfce4eabc9c69fa96af2b921e8eb6628cb8daa4b3224080: Status 404 returned error can't find the container with id 20db7883d2aef422fcfce4eabc9c69fa96af2b921e8eb6628cb8daa4b3224080 Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.481406 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 11:44:35 crc kubenswrapper[4669]: W1001 11:44:35.513676 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83f3ffe1_ac22_408f_ab82_73d5cfd82953.slice/crio-dbe1fd528c63101b47df80f33bab1f92ab5b5f0586cc96d95a0d6b416d438f13 WatchSource:0}: Error finding container dbe1fd528c63101b47df80f33bab1f92ab5b5f0586cc96d95a0d6b416d438f13: Status 404 returned error can't find the container with id dbe1fd528c63101b47df80f33bab1f92ab5b5f0586cc96d95a0d6b416d438f13 Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.581732 4669 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kt8kt"] Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.613221 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb7fm"] Oct 01 11:44:35 crc kubenswrapper[4669]: W1001 11:44:35.630512 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60fd3501_3d20_401c_b46c_ebc2451bf0ce.slice/crio-46d71328fb87b3940d665ae81dcec382b801e1421907fb626e54bf9c4f808b32 WatchSource:0}: Error finding container 46d71328fb87b3940d665ae81dcec382b801e1421907fb626e54bf9c4f808b32: Status 404 returned error can't find the container with id 46d71328fb87b3940d665ae81dcec382b801e1421907fb626e54bf9c4f808b32 Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.662155 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" event={"ID":"e20229bd-54f8-4a0e-a2d4-42102e32950e","Type":"ContainerDied","Data":"7b2cf604c60fc1d0d97ec2c7be8a585388ebd4fe20880e6737974081da400127"} Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.662266 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mlrtj" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.671244 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"83f3ffe1-ac22-408f-ab82-73d5cfd82953","Type":"ContainerStarted","Data":"dbe1fd528c63101b47df80f33bab1f92ab5b5f0586cc96d95a0d6b416d438f13"} Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.673537 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" event={"ID":"29b42a35-31e5-4a3b-bcee-209091e48b9c","Type":"ContainerStarted","Data":"20db7883d2aef422fcfce4eabc9c69fa96af2b921e8eb6628cb8daa4b3224080"} Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.680585 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nsrfk" event={"ID":"b77a4c9a-0426-40f6-a28a-7b985aebc4a2","Type":"ContainerStarted","Data":"7eb608c1c77b3e45b83500e1c8627ea12f7788e3fceec821a48b388ade8e8e7f"} Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.686106 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xb7fm" event={"ID":"60fd3501-3d20-401c-b46c-ebc2451bf0ce","Type":"ContainerStarted","Data":"46d71328fb87b3940d665ae81dcec382b801e1421907fb626e54bf9c4f808b32"} Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.690351 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" event={"ID":"8de1ee14-9826-4903-a1d3-ee0b8c2416c6","Type":"ContainerDied","Data":"dd0ae566256a2d3f23c4c99bddae1842ff62c7a77b29cd061d1abef7409414c4"} Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.690423 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l9zjr" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.694332 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" event={"ID":"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae","Type":"ContainerStarted","Data":"0ee763bb503f02f591e86330b7886a955f761a4e81a9aa30df25c7f2bfc64431"} Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.751344 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l9zjr"] Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.765284 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l9zjr"] Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.786349 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mlrtj"] Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.792648 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mlrtj"] Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.936776 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.943162 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.947931 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.948326 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.948495 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.949990 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-92xrc" Oct 01 11:44:35 crc kubenswrapper[4669]: I1001 11:44:35.984844 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.008155 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.008280 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/681d4309-a9a8-4c2c-bf25-4619653187fd-cache\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.008321 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcqvk\" (UniqueName: \"kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-kube-api-access-xcqvk\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " 
pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.008367 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/681d4309-a9a8-4c2c-bf25-4619653187fd-lock\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.008402 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.110800 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.110907 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/681d4309-a9a8-4c2c-bf25-4619653187fd-cache\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.110941 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcqvk\" (UniqueName: \"kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-kube-api-access-xcqvk\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.110985 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/681d4309-a9a8-4c2c-bf25-4619653187fd-lock\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.111026 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: E1001 11:44:36.111097 4669 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 11:44:36 crc kubenswrapper[4669]: E1001 11:44:36.111161 4669 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 11:44:36 crc kubenswrapper[4669]: E1001 11:44:36.111258 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift podName:681d4309-a9a8-4c2c-bf25-4619653187fd nodeName:}" failed. No retries permitted until 2025-10-01 11:44:36.611205242 +0000 UTC m=+967.710770229 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift") pod "swift-storage-0" (UID: "681d4309-a9a8-4c2c-bf25-4619653187fd") : configmap "swift-ring-files" not found Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.111410 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.112250 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/681d4309-a9a8-4c2c-bf25-4619653187fd-lock\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.113175 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/681d4309-a9a8-4c2c-bf25-4619653187fd-cache\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.135430 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcqvk\" (UniqueName: \"kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-kube-api-access-xcqvk\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.136342 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " 
pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.452966 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-pw6p8"] Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.456416 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.459602 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.459929 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.463119 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.526749 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pw6p8"] Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.529619 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-ring-data-devices\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.529694 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-etc-swift\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.530129 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-dispersionconf\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.531276 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-scripts\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.531399 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-swiftconf\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.531444 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-combined-ca-bundle\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.531661 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dxng\" (UniqueName: \"kubernetes.io/projected/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-kube-api-access-8dxng\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.634014 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-ring-data-devices\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.634109 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-etc-swift\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.634174 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-dispersionconf\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.634246 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.634276 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-scripts\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.634314 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-swiftconf\") pod 
\"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.634338 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-combined-ca-bundle\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.634446 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dxng\" (UniqueName: \"kubernetes.io/projected/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-kube-api-access-8dxng\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: E1001 11:44:36.634456 4669 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 11:44:36 crc kubenswrapper[4669]: E1001 11:44:36.634488 4669 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 11:44:36 crc kubenswrapper[4669]: E1001 11:44:36.634552 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift podName:681d4309-a9a8-4c2c-bf25-4619653187fd nodeName:}" failed. No retries permitted until 2025-10-01 11:44:37.6345297 +0000 UTC m=+968.734094677 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift") pod "swift-storage-0" (UID: "681d4309-a9a8-4c2c-bf25-4619653187fd") : configmap "swift-ring-files" not found Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.634717 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-etc-swift\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.634947 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-ring-data-devices\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.635443 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-scripts\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.639634 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-dispersionconf\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.643697 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-combined-ca-bundle\") pod 
\"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.653802 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-swiftconf\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.659599 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dxng\" (UniqueName: \"kubernetes.io/projected/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-kube-api-access-8dxng\") pod \"swift-ring-rebalance-pw6p8\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.704808 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nsrfk" event={"ID":"b77a4c9a-0426-40f6-a28a-7b985aebc4a2","Type":"ContainerStarted","Data":"4974f9b61d43438a6249f6a94ed25a7fce37faa1c4fdaf910f65caab35dbfeae"} Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.706890 4669 generic.go:334] "Generic (PLEG): container finished" podID="60fd3501-3d20-401c-b46c-ebc2451bf0ce" containerID="67676c9744e09cef9d723fc7f389d796be3188532f95e17ddf62d1a8339c4690" exitCode=0 Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.707293 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xb7fm" event={"ID":"60fd3501-3d20-401c-b46c-ebc2451bf0ce","Type":"ContainerDied","Data":"67676c9744e09cef9d723fc7f389d796be3188532f95e17ddf62d1a8339c4690"} Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.713718 4669 generic.go:334] "Generic (PLEG): container finished" podID="e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" 
containerID="c3d652ade0539818751b7f45ab94ce4cda0544105acd63f0619bc8e958d3adf4" exitCode=0 Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.713774 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" event={"ID":"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae","Type":"ContainerDied","Data":"c3d652ade0539818751b7f45ab94ce4cda0544105acd63f0619bc8e958d3adf4"} Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.735730 4669 generic.go:334] "Generic (PLEG): container finished" podID="29b42a35-31e5-4a3b-bcee-209091e48b9c" containerID="3088bb931e5e10efb06ede7a965fa0ef62bf4a4a1f7256b28e93b12b559c02c5" exitCode=0 Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.735787 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" event={"ID":"29b42a35-31e5-4a3b-bcee-209091e48b9c","Type":"ContainerDied","Data":"3088bb931e5e10efb06ede7a965fa0ef62bf4a4a1f7256b28e93b12b559c02c5"} Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.760337 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nsrfk" podStartSLOduration=2.760308756 podStartE2EDuration="2.760308756s" podCreationTimestamp="2025-10-01 11:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:44:36.73568486 +0000 UTC m=+967.835249837" watchObservedRunningTime="2025-10-01 11:44:36.760308756 +0000 UTC m=+967.859873733" Oct 01 11:44:36 crc kubenswrapper[4669]: I1001 11:44:36.832096 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.161113 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.245839 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-config\") pod \"29b42a35-31e5-4a3b-bcee-209091e48b9c\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.245935 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-dns-svc\") pod \"29b42a35-31e5-4a3b-bcee-209091e48b9c\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.246223 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7kgw\" (UniqueName: \"kubernetes.io/projected/29b42a35-31e5-4a3b-bcee-209091e48b9c-kube-api-access-c7kgw\") pod \"29b42a35-31e5-4a3b-bcee-209091e48b9c\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.246314 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-ovsdbserver-nb\") pod \"29b42a35-31e5-4a3b-bcee-209091e48b9c\" (UID: \"29b42a35-31e5-4a3b-bcee-209091e48b9c\") " Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.254925 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b42a35-31e5-4a3b-bcee-209091e48b9c-kube-api-access-c7kgw" (OuterVolumeSpecName: "kube-api-access-c7kgw") pod "29b42a35-31e5-4a3b-bcee-209091e48b9c" (UID: "29b42a35-31e5-4a3b-bcee-209091e48b9c"). InnerVolumeSpecName "kube-api-access-c7kgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.274946 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-config" (OuterVolumeSpecName: "config") pod "29b42a35-31e5-4a3b-bcee-209091e48b9c" (UID: "29b42a35-31e5-4a3b-bcee-209091e48b9c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.278899 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29b42a35-31e5-4a3b-bcee-209091e48b9c" (UID: "29b42a35-31e5-4a3b-bcee-209091e48b9c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.279236 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29b42a35-31e5-4a3b-bcee-209091e48b9c" (UID: "29b42a35-31e5-4a3b-bcee-209091e48b9c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.349259 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7kgw\" (UniqueName: \"kubernetes.io/projected/29b42a35-31e5-4a3b-bcee-209091e48b9c-kube-api-access-c7kgw\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.349306 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.349320 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.349331 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b42a35-31e5-4a3b-bcee-209091e48b9c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.413754 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pw6p8"] Oct 01 11:44:37 crc kubenswrapper[4669]: W1001 11:44:37.593432 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c77921b_54a6_48fd_a57c_4c14d17bf7d3.slice/crio-db29e9fde87e8f4e905b7cf427df1712f96dc7da931cd4a29d9ba97df89ff1e7 WatchSource:0}: Error finding container db29e9fde87e8f4e905b7cf427df1712f96dc7da931cd4a29d9ba97df89ff1e7: Status 404 returned error can't find the container with id db29e9fde87e8f4e905b7cf427df1712f96dc7da931cd4a29d9ba97df89ff1e7 Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.654735 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:37 crc kubenswrapper[4669]: E1001 11:44:37.655020 4669 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 11:44:37 crc kubenswrapper[4669]: E1001 11:44:37.655034 4669 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 11:44:37 crc kubenswrapper[4669]: E1001 11:44:37.655141 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift podName:681d4309-a9a8-4c2c-bf25-4619653187fd nodeName:}" failed. No retries permitted until 2025-10-01 11:44:39.65506281 +0000 UTC m=+970.754627787 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift") pod "swift-storage-0" (UID: "681d4309-a9a8-4c2c-bf25-4619653187fd") : configmap "swift-ring-files" not found Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.658919 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de1ee14-9826-4903-a1d3-ee0b8c2416c6" path="/var/lib/kubelet/pods/8de1ee14-9826-4903-a1d3-ee0b8c2416c6/volumes" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.659316 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20229bd-54f8-4a0e-a2d4-42102e32950e" path="/var/lib/kubelet/pods/e20229bd-54f8-4a0e-a2d4-42102e32950e/volumes" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.751171 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.751163 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-lbq97" event={"ID":"29b42a35-31e5-4a3b-bcee-209091e48b9c","Type":"ContainerDied","Data":"20db7883d2aef422fcfce4eabc9c69fa96af2b921e8eb6628cb8daa4b3224080"} Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.751274 4669 scope.go:117] "RemoveContainer" containerID="3088bb931e5e10efb06ede7a965fa0ef62bf4a4a1f7256b28e93b12b559c02c5" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.756173 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pw6p8" event={"ID":"9c77921b-54a6-48fd-a57c-4c14d17bf7d3","Type":"ContainerStarted","Data":"db29e9fde87e8f4e905b7cf427df1712f96dc7da931cd4a29d9ba97df89ff1e7"} Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.761911 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xb7fm" event={"ID":"60fd3501-3d20-401c-b46c-ebc2451bf0ce","Type":"ContainerStarted","Data":"9c022035fbe45c8d1dad8c7e48ba39d1b529f02b6d8a3cb1b94587890a72729f"} Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.762093 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.769225 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" event={"ID":"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae","Type":"ContainerStarted","Data":"1b3ac5ddf125d1e72749603346ef81cd1671bbbc77987d3efac36d3d6101d156"} Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.769314 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.809883 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-698758b865-xb7fm" podStartSLOduration=3.420097704 podStartE2EDuration="3.809861962s" podCreationTimestamp="2025-10-01 11:44:34 +0000 UTC" firstStartedPulling="2025-10-01 11:44:35.63353542 +0000 UTC m=+966.733100397" lastFinishedPulling="2025-10-01 11:44:36.023299688 +0000 UTC m=+967.122864655" observedRunningTime="2025-10-01 11:44:37.788468056 +0000 UTC m=+968.888033073" watchObservedRunningTime="2025-10-01 11:44:37.809861962 +0000 UTC m=+968.909426939" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.813673 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" podStartSLOduration=3.3327907740000002 podStartE2EDuration="3.813661686s" podCreationTimestamp="2025-10-01 11:44:34 +0000 UTC" firstStartedPulling="2025-10-01 11:44:35.60147937 +0000 UTC m=+966.701044347" lastFinishedPulling="2025-10-01 11:44:36.082350282 +0000 UTC m=+967.181915259" observedRunningTime="2025-10-01 11:44:37.802799118 +0000 UTC m=+968.902364095" watchObservedRunningTime="2025-10-01 11:44:37.813661686 +0000 UTC m=+968.913226663" Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.845244 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-lbq97"] Oct 01 11:44:37 crc kubenswrapper[4669]: I1001 11:44:37.855743 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-lbq97"] Oct 01 11:44:38 crc kubenswrapper[4669]: I1001 11:44:38.794055 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"83f3ffe1-ac22-408f-ab82-73d5cfd82953","Type":"ContainerStarted","Data":"f67f84cfb2a5ba131dac2fcddcf3d0e3097e29821c63c8ca9208bcd82f93ee4f"} Oct 01 11:44:38 crc kubenswrapper[4669]: I1001 11:44:38.794521 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 01 11:44:38 crc kubenswrapper[4669]: I1001 11:44:38.794539 4669 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"83f3ffe1-ac22-408f-ab82-73d5cfd82953","Type":"ContainerStarted","Data":"a4b7853a5203b3c0923408471a32d1e6890294536d217b5380fef5d789c265c9"} Oct 01 11:44:38 crc kubenswrapper[4669]: I1001 11:44:38.824441 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.701627262 podStartE2EDuration="4.824410486s" podCreationTimestamp="2025-10-01 11:44:34 +0000 UTC" firstStartedPulling="2025-10-01 11:44:35.518275391 +0000 UTC m=+966.617840358" lastFinishedPulling="2025-10-01 11:44:37.641058605 +0000 UTC m=+968.740623582" observedRunningTime="2025-10-01 11:44:38.819798532 +0000 UTC m=+969.919363509" watchObservedRunningTime="2025-10-01 11:44:38.824410486 +0000 UTC m=+969.923975463" Oct 01 11:44:39 crc kubenswrapper[4669]: I1001 11:44:39.657987 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29b42a35-31e5-4a3b-bcee-209091e48b9c" path="/var/lib/kubelet/pods/29b42a35-31e5-4a3b-bcee-209091e48b9c/volumes" Oct 01 11:44:39 crc kubenswrapper[4669]: I1001 11:44:39.715186 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:39 crc kubenswrapper[4669]: E1001 11:44:39.715858 4669 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 11:44:39 crc kubenswrapper[4669]: E1001 11:44:39.716029 4669 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 11:44:39 crc kubenswrapper[4669]: E1001 11:44:39.716160 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift 
podName:681d4309-a9a8-4c2c-bf25-4619653187fd nodeName:}" failed. No retries permitted until 2025-10-01 11:44:43.716133954 +0000 UTC m=+974.815698931 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift") pod "swift-storage-0" (UID: "681d4309-a9a8-4c2c-bf25-4619653187fd") : configmap "swift-ring-files" not found Oct 01 11:44:41 crc kubenswrapper[4669]: I1001 11:44:41.838667 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pw6p8" event={"ID":"9c77921b-54a6-48fd-a57c-4c14d17bf7d3","Type":"ContainerStarted","Data":"fb47b02f8b05af66931d42675c1c447324c1c26bb7f95b39616d75ba1e31ee75"} Oct 01 11:44:41 crc kubenswrapper[4669]: I1001 11:44:41.844652 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 01 11:44:41 crc kubenswrapper[4669]: I1001 11:44:41.844701 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 01 11:44:41 crc kubenswrapper[4669]: I1001 11:44:41.873327 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-pw6p8" podStartSLOduration=2.482037041 podStartE2EDuration="5.873304298s" podCreationTimestamp="2025-10-01 11:44:36 +0000 UTC" firstStartedPulling="2025-10-01 11:44:37.598347784 +0000 UTC m=+968.697912761" lastFinishedPulling="2025-10-01 11:44:40.989615031 +0000 UTC m=+972.089180018" observedRunningTime="2025-10-01 11:44:41.861332436 +0000 UTC m=+972.960897453" watchObservedRunningTime="2025-10-01 11:44:41.873304298 +0000 UTC m=+972.972869275" Oct 01 11:44:41 crc kubenswrapper[4669]: I1001 11:44:41.910014 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 01 11:44:42 crc kubenswrapper[4669]: I1001 11:44:42.275200 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:42 crc kubenswrapper[4669]: I1001 11:44:42.275629 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:42 crc kubenswrapper[4669]: I1001 11:44:42.354282 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:42 crc kubenswrapper[4669]: I1001 11:44:42.924593 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 01 11:44:42 crc kubenswrapper[4669]: I1001 11:44:42.927116 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 01 11:44:43 crc kubenswrapper[4669]: I1001 11:44:43.721610 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:43 crc kubenswrapper[4669]: E1001 11:44:43.722016 4669 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 11:44:43 crc kubenswrapper[4669]: E1001 11:44:43.722111 4669 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 11:44:43 crc kubenswrapper[4669]: E1001 11:44:43.722225 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift podName:681d4309-a9a8-4c2c-bf25-4619653187fd nodeName:}" failed. No retries permitted until 2025-10-01 11:44:51.72219132 +0000 UTC m=+982.821756327 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift") pod "swift-storage-0" (UID: "681d4309-a9a8-4c2c-bf25-4619653187fd") : configmap "swift-ring-files" not found Oct 01 11:44:44 crc kubenswrapper[4669]: I1001 11:44:44.831312 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:45 crc kubenswrapper[4669]: I1001 11:44:45.274265 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:44:45 crc kubenswrapper[4669]: I1001 11:44:45.347478 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kt8kt"] Oct 01 11:44:45 crc kubenswrapper[4669]: I1001 11:44:45.350423 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" podUID="e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" containerName="dnsmasq-dns" containerID="cri-o://1b3ac5ddf125d1e72749603346ef81cd1671bbbc77987d3efac36d3d6101d156" gracePeriod=10 Oct 01 11:44:45 crc kubenswrapper[4669]: I1001 11:44:45.876950 4669 generic.go:334] "Generic (PLEG): container finished" podID="e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" containerID="1b3ac5ddf125d1e72749603346ef81cd1671bbbc77987d3efac36d3d6101d156" exitCode=0 Oct 01 11:44:45 crc kubenswrapper[4669]: I1001 11:44:45.877333 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" event={"ID":"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae","Type":"ContainerDied","Data":"1b3ac5ddf125d1e72749603346ef81cd1671bbbc77987d3efac36d3d6101d156"} Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.481224 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.605413 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-sb\") pod \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.605480 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-config\") pod \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.605634 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-dns-svc\") pod \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.605731 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqqgv\" (UniqueName: \"kubernetes.io/projected/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-kube-api-access-qqqgv\") pod \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.606160 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-nb\") pod \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\" (UID: \"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae\") " Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.611923 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-kube-api-access-qqqgv" (OuterVolumeSpecName: "kube-api-access-qqqgv") pod "e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" (UID: "e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae"). InnerVolumeSpecName "kube-api-access-qqqgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.650197 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-config" (OuterVolumeSpecName: "config") pod "e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" (UID: "e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.652945 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" (UID: "e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.655131 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" (UID: "e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.664635 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" (UID: "e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.708018 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.708055 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.708064 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.708088 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqqgv\" (UniqueName: \"kubernetes.io/projected/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-kube-api-access-qqqgv\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.708099 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.889589 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" event={"ID":"e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae","Type":"ContainerDied","Data":"0ee763bb503f02f591e86330b7886a955f761a4e81a9aa30df25c7f2bfc64431"} Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.889682 4669 scope.go:117] "RemoveContainer" containerID="1b3ac5ddf125d1e72749603346ef81cd1671bbbc77987d3efac36d3d6101d156" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.889888 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kt8kt" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.922185 4669 scope.go:117] "RemoveContainer" containerID="c3d652ade0539818751b7f45ab94ce4cda0544105acd63f0619bc8e958d3adf4" Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.947730 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kt8kt"] Oct 01 11:44:46 crc kubenswrapper[4669]: I1001 11:44:46.958211 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kt8kt"] Oct 01 11:44:47 crc kubenswrapper[4669]: I1001 11:44:47.672271 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" path="/var/lib/kubelet/pods/e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae/volumes" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.151897 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-5s8rb"] Oct 01 11:44:48 crc kubenswrapper[4669]: E1001 11:44:48.152361 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" containerName="dnsmasq-dns" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.152376 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" containerName="dnsmasq-dns" Oct 01 11:44:48 crc kubenswrapper[4669]: E1001 11:44:48.152407 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" containerName="init" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.152414 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" containerName="init" Oct 01 11:44:48 crc kubenswrapper[4669]: E1001 11:44:48.152429 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b42a35-31e5-4a3b-bcee-209091e48b9c" containerName="init" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 
11:44:48.152435 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b42a35-31e5-4a3b-bcee-209091e48b9c" containerName="init" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.152579 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7cbdb92-cc6f-458b-98d5-95b2aac5a2ae" containerName="dnsmasq-dns" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.152600 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b42a35-31e5-4a3b-bcee-209091e48b9c" containerName="init" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.153191 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5s8rb" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.167260 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5s8rb"] Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.244932 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmvtb\" (UniqueName: \"kubernetes.io/projected/02cd23a9-752d-439c-9630-94967cef4a4f-kube-api-access-pmvtb\") pod \"glance-db-create-5s8rb\" (UID: \"02cd23a9-752d-439c-9630-94967cef4a4f\") " pod="openstack/glance-db-create-5s8rb" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.348048 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmvtb\" (UniqueName: \"kubernetes.io/projected/02cd23a9-752d-439c-9630-94967cef4a4f-kube-api-access-pmvtb\") pod \"glance-db-create-5s8rb\" (UID: \"02cd23a9-752d-439c-9630-94967cef4a4f\") " pod="openstack/glance-db-create-5s8rb" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.381000 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmvtb\" (UniqueName: \"kubernetes.io/projected/02cd23a9-752d-439c-9630-94967cef4a4f-kube-api-access-pmvtb\") pod \"glance-db-create-5s8rb\" (UID: 
\"02cd23a9-752d-439c-9630-94967cef4a4f\") " pod="openstack/glance-db-create-5s8rb" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.538731 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5s8rb" Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.925144 4669 generic.go:334] "Generic (PLEG): container finished" podID="9c77921b-54a6-48fd-a57c-4c14d17bf7d3" containerID="fb47b02f8b05af66931d42675c1c447324c1c26bb7f95b39616d75ba1e31ee75" exitCode=0 Oct 01 11:44:48 crc kubenswrapper[4669]: I1001 11:44:48.925239 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pw6p8" event={"ID":"9c77921b-54a6-48fd-a57c-4c14d17bf7d3","Type":"ContainerDied","Data":"fb47b02f8b05af66931d42675c1c447324c1c26bb7f95b39616d75ba1e31ee75"} Oct 01 11:44:49 crc kubenswrapper[4669]: I1001 11:44:49.031344 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5s8rb"] Oct 01 11:44:49 crc kubenswrapper[4669]: I1001 11:44:49.939194 4669 generic.go:334] "Generic (PLEG): container finished" podID="02cd23a9-752d-439c-9630-94967cef4a4f" containerID="47c2c32808c1d5ac1efdf38d9c27b318d984ef8a5eb4378fb124006bd1d7be47" exitCode=0 Oct 01 11:44:49 crc kubenswrapper[4669]: I1001 11:44:49.939304 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5s8rb" event={"ID":"02cd23a9-752d-439c-9630-94967cef4a4f","Type":"ContainerDied","Data":"47c2c32808c1d5ac1efdf38d9c27b318d984ef8a5eb4378fb124006bd1d7be47"} Oct 01 11:44:49 crc kubenswrapper[4669]: I1001 11:44:49.939655 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5s8rb" event={"ID":"02cd23a9-752d-439c-9630-94967cef4a4f","Type":"ContainerStarted","Data":"03eee637ef669c8a35aa7076b21569ad11f559050ae1d0c4f621028a137fabf1"} Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.077689 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-northd-0" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.304478 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.402515 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-etc-swift\") pod \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.402977 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-ring-data-devices\") pod \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.403417 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-dispersionconf\") pod \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.403609 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-combined-ca-bundle\") pod \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.403691 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9c77921b-54a6-48fd-a57c-4c14d17bf7d3" (UID: "9c77921b-54a6-48fd-a57c-4c14d17bf7d3"). 
InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.403726 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-swiftconf\") pod \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.403844 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-scripts\") pod \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.403960 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dxng\" (UniqueName: \"kubernetes.io/projected/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-kube-api-access-8dxng\") pod \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\" (UID: \"9c77921b-54a6-48fd-a57c-4c14d17bf7d3\") " Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.404906 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9c77921b-54a6-48fd-a57c-4c14d17bf7d3" (UID: "9c77921b-54a6-48fd-a57c-4c14d17bf7d3"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.405785 4669 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.405856 4669 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.410889 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-kube-api-access-8dxng" (OuterVolumeSpecName: "kube-api-access-8dxng") pod "9c77921b-54a6-48fd-a57c-4c14d17bf7d3" (UID: "9c77921b-54a6-48fd-a57c-4c14d17bf7d3"). InnerVolumeSpecName "kube-api-access-8dxng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.419787 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9c77921b-54a6-48fd-a57c-4c14d17bf7d3" (UID: "9c77921b-54a6-48fd-a57c-4c14d17bf7d3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.429126 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-scripts" (OuterVolumeSpecName: "scripts") pod "9c77921b-54a6-48fd-a57c-4c14d17bf7d3" (UID: "9c77921b-54a6-48fd-a57c-4c14d17bf7d3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.437975 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c77921b-54a6-48fd-a57c-4c14d17bf7d3" (UID: "9c77921b-54a6-48fd-a57c-4c14d17bf7d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.439775 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9c77921b-54a6-48fd-a57c-4c14d17bf7d3" (UID: "9c77921b-54a6-48fd-a57c-4c14d17bf7d3"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.507695 4669 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.507736 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.507747 4669 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.507756 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:50 crc 
kubenswrapper[4669]: I1001 11:44:50.507769 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dxng\" (UniqueName: \"kubernetes.io/projected/9c77921b-54a6-48fd-a57c-4c14d17bf7d3-kube-api-access-8dxng\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.979768 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pw6p8" Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.980446 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pw6p8" event={"ID":"9c77921b-54a6-48fd-a57c-4c14d17bf7d3","Type":"ContainerDied","Data":"db29e9fde87e8f4e905b7cf427df1712f96dc7da931cd4a29d9ba97df89ff1e7"} Oct 01 11:44:50 crc kubenswrapper[4669]: I1001 11:44:50.980527 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db29e9fde87e8f4e905b7cf427df1712f96dc7da931cd4a29d9ba97df89ff1e7" Oct 01 11:44:51 crc kubenswrapper[4669]: I1001 11:44:51.383958 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5s8rb" Oct 01 11:44:51 crc kubenswrapper[4669]: I1001 11:44:51.464443 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmvtb\" (UniqueName: \"kubernetes.io/projected/02cd23a9-752d-439c-9630-94967cef4a4f-kube-api-access-pmvtb\") pod \"02cd23a9-752d-439c-9630-94967cef4a4f\" (UID: \"02cd23a9-752d-439c-9630-94967cef4a4f\") " Oct 01 11:44:51 crc kubenswrapper[4669]: I1001 11:44:51.479697 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02cd23a9-752d-439c-9630-94967cef4a4f-kube-api-access-pmvtb" (OuterVolumeSpecName: "kube-api-access-pmvtb") pod "02cd23a9-752d-439c-9630-94967cef4a4f" (UID: "02cd23a9-752d-439c-9630-94967cef4a4f"). InnerVolumeSpecName "kube-api-access-pmvtb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:44:51 crc kubenswrapper[4669]: I1001 11:44:51.569681 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmvtb\" (UniqueName: \"kubernetes.io/projected/02cd23a9-752d-439c-9630-94967cef4a4f-kube-api-access-pmvtb\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:51 crc kubenswrapper[4669]: I1001 11:44:51.788821 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:51 crc kubenswrapper[4669]: I1001 11:44:51.794877 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/681d4309-a9a8-4c2c-bf25-4619653187fd-etc-swift\") pod \"swift-storage-0\" (UID: \"681d4309-a9a8-4c2c-bf25-4619653187fd\") " pod="openstack/swift-storage-0" Oct 01 11:44:51 crc kubenswrapper[4669]: I1001 11:44:51.866330 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 01 11:44:51 crc kubenswrapper[4669]: I1001 11:44:51.997193 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5s8rb" event={"ID":"02cd23a9-752d-439c-9630-94967cef4a4f","Type":"ContainerDied","Data":"03eee637ef669c8a35aa7076b21569ad11f559050ae1d0c4f621028a137fabf1"} Oct 01 11:44:51 crc kubenswrapper[4669]: I1001 11:44:51.997671 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03eee637ef669c8a35aa7076b21569ad11f559050ae1d0c4f621028a137fabf1" Oct 01 11:44:51 crc kubenswrapper[4669]: I1001 11:44:51.997580 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5s8rb" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.257511 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 11:44:52 crc kubenswrapper[4669]: W1001 11:44:52.264947 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod681d4309_a9a8_4c2c_bf25_4619653187fd.slice/crio-6aeda7eddd5a5c3b27446fcc0cb8980330c9744f872beb3f633de53e70f19182 WatchSource:0}: Error finding container 6aeda7eddd5a5c3b27446fcc0cb8980330c9744f872beb3f633de53e70f19182: Status 404 returned error can't find the container with id 6aeda7eddd5a5c3b27446fcc0cb8980330c9744f872beb3f633de53e70f19182 Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.392933 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-spqkd"] Oct 01 11:44:52 crc kubenswrapper[4669]: E1001 11:44:52.393571 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cd23a9-752d-439c-9630-94967cef4a4f" containerName="mariadb-database-create" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.393596 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cd23a9-752d-439c-9630-94967cef4a4f" containerName="mariadb-database-create" Oct 01 11:44:52 crc kubenswrapper[4669]: E1001 11:44:52.393625 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c77921b-54a6-48fd-a57c-4c14d17bf7d3" containerName="swift-ring-rebalance" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.393634 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c77921b-54a6-48fd-a57c-4c14d17bf7d3" containerName="swift-ring-rebalance" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.393830 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c77921b-54a6-48fd-a57c-4c14d17bf7d3" containerName="swift-ring-rebalance" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 
11:44:52.393852 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="02cd23a9-752d-439c-9630-94967cef4a4f" containerName="mariadb-database-create" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.394560 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-spqkd" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.414853 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-spqkd"] Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.502456 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhklg\" (UniqueName: \"kubernetes.io/projected/3362993f-ccb2-4f32-935f-8fd98745982e-kube-api-access-rhklg\") pod \"keystone-db-create-spqkd\" (UID: \"3362993f-ccb2-4f32-935f-8fd98745982e\") " pod="openstack/keystone-db-create-spqkd" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.605023 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhklg\" (UniqueName: \"kubernetes.io/projected/3362993f-ccb2-4f32-935f-8fd98745982e-kube-api-access-rhklg\") pod \"keystone-db-create-spqkd\" (UID: \"3362993f-ccb2-4f32-935f-8fd98745982e\") " pod="openstack/keystone-db-create-spqkd" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.631131 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhklg\" (UniqueName: \"kubernetes.io/projected/3362993f-ccb2-4f32-935f-8fd98745982e-kube-api-access-rhklg\") pod \"keystone-db-create-spqkd\" (UID: \"3362993f-ccb2-4f32-935f-8fd98745982e\") " pod="openstack/keystone-db-create-spqkd" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.693021 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-f24vb"] Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.694927 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-f24vb" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.704666 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-f24vb"] Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.728035 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-spqkd" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.808824 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j8cg\" (UniqueName: \"kubernetes.io/projected/5511e00f-0d66-4990-a6b2-7223178d2806-kube-api-access-8j8cg\") pod \"placement-db-create-f24vb\" (UID: \"5511e00f-0d66-4990-a6b2-7223178d2806\") " pod="openstack/placement-db-create-f24vb" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.910955 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j8cg\" (UniqueName: \"kubernetes.io/projected/5511e00f-0d66-4990-a6b2-7223178d2806-kube-api-access-8j8cg\") pod \"placement-db-create-f24vb\" (UID: \"5511e00f-0d66-4990-a6b2-7223178d2806\") " pod="openstack/placement-db-create-f24vb" Oct 01 11:44:52 crc kubenswrapper[4669]: I1001 11:44:52.933827 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j8cg\" (UniqueName: \"kubernetes.io/projected/5511e00f-0d66-4990-a6b2-7223178d2806-kube-api-access-8j8cg\") pod \"placement-db-create-f24vb\" (UID: \"5511e00f-0d66-4990-a6b2-7223178d2806\") " pod="openstack/placement-db-create-f24vb" Oct 01 11:44:53 crc kubenswrapper[4669]: I1001 11:44:53.016622 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"6aeda7eddd5a5c3b27446fcc0cb8980330c9744f872beb3f633de53e70f19182"} Oct 01 11:44:53 crc kubenswrapper[4669]: I1001 11:44:53.022388 4669 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-spqkd"] Oct 01 11:44:53 crc kubenswrapper[4669]: W1001 11:44:53.029407 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3362993f_ccb2_4f32_935f_8fd98745982e.slice/crio-17f1cf94fa4d8275559542888f19ebb34b21c8408ffb68aee72f7c47436bd904 WatchSource:0}: Error finding container 17f1cf94fa4d8275559542888f19ebb34b21c8408ffb68aee72f7c47436bd904: Status 404 returned error can't find the container with id 17f1cf94fa4d8275559542888f19ebb34b21c8408ffb68aee72f7c47436bd904 Oct 01 11:44:53 crc kubenswrapper[4669]: I1001 11:44:53.032594 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f24vb" Oct 01 11:44:53 crc kubenswrapper[4669]: I1001 11:44:53.526391 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-f24vb"] Oct 01 11:44:53 crc kubenswrapper[4669]: W1001 11:44:53.713855 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5511e00f_0d66_4990_a6b2_7223178d2806.slice/crio-6a8c98d53a000f38613794401cb9d5c9a32f1b51faa035da772b587917364599 WatchSource:0}: Error finding container 6a8c98d53a000f38613794401cb9d5c9a32f1b51faa035da772b587917364599: Status 404 returned error can't find the container with id 6a8c98d53a000f38613794401cb9d5c9a32f1b51faa035da772b587917364599 Oct 01 11:44:54 crc kubenswrapper[4669]: I1001 11:44:54.026983 4669 generic.go:334] "Generic (PLEG): container finished" podID="3362993f-ccb2-4f32-935f-8fd98745982e" containerID="3774d6df56756e5ff37df2699f23239a4813c9f5a4f6ab7863436d871ee0cef5" exitCode=0 Oct 01 11:44:54 crc kubenswrapper[4669]: I1001 11:44:54.027108 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-spqkd" 
event={"ID":"3362993f-ccb2-4f32-935f-8fd98745982e","Type":"ContainerDied","Data":"3774d6df56756e5ff37df2699f23239a4813c9f5a4f6ab7863436d871ee0cef5"} Oct 01 11:44:54 crc kubenswrapper[4669]: I1001 11:44:54.027152 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-spqkd" event={"ID":"3362993f-ccb2-4f32-935f-8fd98745982e","Type":"ContainerStarted","Data":"17f1cf94fa4d8275559542888f19ebb34b21c8408ffb68aee72f7c47436bd904"} Oct 01 11:44:54 crc kubenswrapper[4669]: I1001 11:44:54.029889 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"e6ef950d76e6c9434d0168ed86d0cb5d32ca2b0f1d006563ae919928586d35ef"} Oct 01 11:44:54 crc kubenswrapper[4669]: I1001 11:44:54.032188 4669 generic.go:334] "Generic (PLEG): container finished" podID="4619f705-9393-48c8-bc69-2d6183546af2" containerID="809ebe8a7f9b3cd52ba8893dd5e5d7f364e22cda0ade7a5b0d6d5c665aced5b6" exitCode=0 Oct 01 11:44:54 crc kubenswrapper[4669]: I1001 11:44:54.032294 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4619f705-9393-48c8-bc69-2d6183546af2","Type":"ContainerDied","Data":"809ebe8a7f9b3cd52ba8893dd5e5d7f364e22cda0ade7a5b0d6d5c665aced5b6"} Oct 01 11:44:54 crc kubenswrapper[4669]: I1001 11:44:54.035808 4669 generic.go:334] "Generic (PLEG): container finished" podID="500653c7-d0f6-46d5-9411-60a17569fdd3" containerID="f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c" exitCode=0 Oct 01 11:44:54 crc kubenswrapper[4669]: I1001 11:44:54.035888 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"500653c7-d0f6-46d5-9411-60a17569fdd3","Type":"ContainerDied","Data":"f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c"} Oct 01 11:44:54 crc kubenswrapper[4669]: I1001 11:44:54.039192 4669 generic.go:334] "Generic (PLEG): container 
finished" podID="5511e00f-0d66-4990-a6b2-7223178d2806" containerID="45a93ebfaefd73fbf00378cd17f3e531c19a7d701114b2f94b4e28dea624906d" exitCode=0 Oct 01 11:44:54 crc kubenswrapper[4669]: I1001 11:44:54.039230 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f24vb" event={"ID":"5511e00f-0d66-4990-a6b2-7223178d2806","Type":"ContainerDied","Data":"45a93ebfaefd73fbf00378cd17f3e531c19a7d701114b2f94b4e28dea624906d"} Oct 01 11:44:54 crc kubenswrapper[4669]: I1001 11:44:54.039252 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f24vb" event={"ID":"5511e00f-0d66-4990-a6b2-7223178d2806","Type":"ContainerStarted","Data":"6a8c98d53a000f38613794401cb9d5c9a32f1b51faa035da772b587917364599"} Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.055250 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"500653c7-d0f6-46d5-9411-60a17569fdd3","Type":"ContainerStarted","Data":"4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9"} Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.056144 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.060761 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"e3391d62ac4f9a73df6aa31f1d4c0f64a7276c9dd40bf3a57020ffe7537af4d0"} Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.060871 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"9d721307930736a2ad5702bf18e94102e78fa0c5b3f06a33b41355784f90e60b"} Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.060895 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"07c48a5cb159ccb778b0b4273ea142281b0104d3381a8b1cf4f95cdecc4e837b"} Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.063933 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4619f705-9393-48c8-bc69-2d6183546af2","Type":"ContainerStarted","Data":"b5c767a1b33375c7e3b5684bfa31b9b72e153b30e7bbe663710482868411b6fa"} Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.096166 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.43875229 podStartE2EDuration="57.096141884s" podCreationTimestamp="2025-10-01 11:43:58 +0000 UTC" firstStartedPulling="2025-10-01 11:44:00.035130616 +0000 UTC m=+931.134695593" lastFinishedPulling="2025-10-01 11:44:19.6925202 +0000 UTC m=+950.792085187" observedRunningTime="2025-10-01 11:44:55.082177803 +0000 UTC m=+986.181742830" watchObservedRunningTime="2025-10-01 11:44:55.096141884 +0000 UTC m=+986.195706881" Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.116971 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.14280151 podStartE2EDuration="57.116946501s" podCreationTimestamp="2025-10-01 11:43:58 +0000 UTC" firstStartedPulling="2025-10-01 11:44:00.639508218 +0000 UTC m=+931.739073195" lastFinishedPulling="2025-10-01 11:44:19.613653179 +0000 UTC m=+950.713218186" observedRunningTime="2025-10-01 11:44:55.110226947 +0000 UTC m=+986.209791944" watchObservedRunningTime="2025-10-01 11:44:55.116946501 +0000 UTC m=+986.216511488" Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.549625 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f24vb" Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.553863 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-spqkd" Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.675597 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j8cg\" (UniqueName: \"kubernetes.io/projected/5511e00f-0d66-4990-a6b2-7223178d2806-kube-api-access-8j8cg\") pod \"5511e00f-0d66-4990-a6b2-7223178d2806\" (UID: \"5511e00f-0d66-4990-a6b2-7223178d2806\") " Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.675864 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhklg\" (UniqueName: \"kubernetes.io/projected/3362993f-ccb2-4f32-935f-8fd98745982e-kube-api-access-rhklg\") pod \"3362993f-ccb2-4f32-935f-8fd98745982e\" (UID: \"3362993f-ccb2-4f32-935f-8fd98745982e\") " Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.684327 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5511e00f-0d66-4990-a6b2-7223178d2806-kube-api-access-8j8cg" (OuterVolumeSpecName: "kube-api-access-8j8cg") pod "5511e00f-0d66-4990-a6b2-7223178d2806" (UID: "5511e00f-0d66-4990-a6b2-7223178d2806"). InnerVolumeSpecName "kube-api-access-8j8cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.685411 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3362993f-ccb2-4f32-935f-8fd98745982e-kube-api-access-rhklg" (OuterVolumeSpecName: "kube-api-access-rhklg") pod "3362993f-ccb2-4f32-935f-8fd98745982e" (UID: "3362993f-ccb2-4f32-935f-8fd98745982e"). InnerVolumeSpecName "kube-api-access-rhklg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.779943 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j8cg\" (UniqueName: \"kubernetes.io/projected/5511e00f-0d66-4990-a6b2-7223178d2806-kube-api-access-8j8cg\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:55 crc kubenswrapper[4669]: I1001 11:44:55.780004 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhklg\" (UniqueName: \"kubernetes.io/projected/3362993f-ccb2-4f32-935f-8fd98745982e-kube-api-access-rhklg\") on node \"crc\" DevicePath \"\"" Oct 01 11:44:56 crc kubenswrapper[4669]: I1001 11:44:56.074339 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-spqkd" Oct 01 11:44:56 crc kubenswrapper[4669]: I1001 11:44:56.074352 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-spqkd" event={"ID":"3362993f-ccb2-4f32-935f-8fd98745982e","Type":"ContainerDied","Data":"17f1cf94fa4d8275559542888f19ebb34b21c8408ffb68aee72f7c47436bd904"} Oct 01 11:44:56 crc kubenswrapper[4669]: I1001 11:44:56.074449 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17f1cf94fa4d8275559542888f19ebb34b21c8408ffb68aee72f7c47436bd904" Oct 01 11:44:56 crc kubenswrapper[4669]: I1001 11:44:56.076701 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-f24vb" Oct 01 11:44:56 crc kubenswrapper[4669]: I1001 11:44:56.076728 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f24vb" event={"ID":"5511e00f-0d66-4990-a6b2-7223178d2806","Type":"ContainerDied","Data":"6a8c98d53a000f38613794401cb9d5c9a32f1b51faa035da772b587917364599"} Oct 01 11:44:56 crc kubenswrapper[4669]: I1001 11:44:56.076823 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a8c98d53a000f38613794401cb9d5c9a32f1b51faa035da772b587917364599" Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.172976 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5df9-account-create-j88kr"] Oct 01 11:44:58 crc kubenswrapper[4669]: E1001 11:44:58.174143 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3362993f-ccb2-4f32-935f-8fd98745982e" containerName="mariadb-database-create" Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.174163 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3362993f-ccb2-4f32-935f-8fd98745982e" containerName="mariadb-database-create" Oct 01 11:44:58 crc kubenswrapper[4669]: E1001 11:44:58.174181 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5511e00f-0d66-4990-a6b2-7223178d2806" containerName="mariadb-database-create" Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.174189 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="5511e00f-0d66-4990-a6b2-7223178d2806" containerName="mariadb-database-create" Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.174413 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="5511e00f-0d66-4990-a6b2-7223178d2806" containerName="mariadb-database-create" Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.174447 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3362993f-ccb2-4f32-935f-8fd98745982e" 
containerName="mariadb-database-create" Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.175208 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5df9-account-create-j88kr" Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.180328 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.216161 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5df9-account-create-j88kr"] Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.230274 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw5zk\" (UniqueName: \"kubernetes.io/projected/6ce18eda-e988-4897-bd2f-8e656c93b271-kube-api-access-gw5zk\") pod \"glance-5df9-account-create-j88kr\" (UID: \"6ce18eda-e988-4897-bd2f-8e656c93b271\") " pod="openstack/glance-5df9-account-create-j88kr" Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.332168 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw5zk\" (UniqueName: \"kubernetes.io/projected/6ce18eda-e988-4897-bd2f-8e656c93b271-kube-api-access-gw5zk\") pod \"glance-5df9-account-create-j88kr\" (UID: \"6ce18eda-e988-4897-bd2f-8e656c93b271\") " pod="openstack/glance-5df9-account-create-j88kr" Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.360159 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw5zk\" (UniqueName: \"kubernetes.io/projected/6ce18eda-e988-4897-bd2f-8e656c93b271-kube-api-access-gw5zk\") pod \"glance-5df9-account-create-j88kr\" (UID: \"6ce18eda-e988-4897-bd2f-8e656c93b271\") " pod="openstack/glance-5df9-account-create-j88kr" Oct 01 11:44:58 crc kubenswrapper[4669]: I1001 11:44:58.549735 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5df9-account-create-j88kr" Oct 01 11:44:59 crc kubenswrapper[4669]: I1001 11:44:59.074967 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5df9-account-create-j88kr"] Oct 01 11:44:59 crc kubenswrapper[4669]: I1001 11:44:59.111851 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5df9-account-create-j88kr" event={"ID":"6ce18eda-e988-4897-bd2f-8e656c93b271","Type":"ContainerStarted","Data":"a8dba6dfec5c275228fd110678c32f4b4c81acad5443b1627de7eeef20aec9a7"} Oct 01 11:44:59 crc kubenswrapper[4669]: I1001 11:44:59.119660 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"911b8b237c22e7b5989cbd50cca7e0262a0feff43962e5610107edbd5d038a36"} Oct 01 11:44:59 crc kubenswrapper[4669]: I1001 11:44:59.119721 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"79167856ff9d79122e51cb3d0b3da639b4980bffc2136e205ffe64e295b0e009"} Oct 01 11:44:59 crc kubenswrapper[4669]: I1001 11:44:59.119732 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"0ae7d5027d532217bede00490aa04d6f8efc99a823f302e0b6d92770f55175b1"} Oct 01 11:44:59 crc kubenswrapper[4669]: I1001 11:44:59.119743 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"eca4559f718da6fd89ee3d0e042ee6f7a4fa217ee42702c5ff73cdfe8da98eaa"} Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.131831 4669 generic.go:334] "Generic (PLEG): container finished" podID="6ce18eda-e988-4897-bd2f-8e656c93b271" 
containerID="979c5a5080442e14529cd058c181b2ec8bedca90c7e9b585e65c10266945e3e2" exitCode=0 Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.131962 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5df9-account-create-j88kr" event={"ID":"6ce18eda-e988-4897-bd2f-8e656c93b271","Type":"ContainerDied","Data":"979c5a5080442e14529cd058c181b2ec8bedca90c7e9b585e65c10266945e3e2"} Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.144909 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.151958 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl"] Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.153708 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.156820 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.156844 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.159752 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl"] Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.270591 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-config-volume\") pod \"collect-profiles-29321985-zsfwl\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.270712 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-secret-volume\") pod \"collect-profiles-29321985-zsfwl\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.270869 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzs8n\" (UniqueName: \"kubernetes.io/projected/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-kube-api-access-mzs8n\") pod \"collect-profiles-29321985-zsfwl\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.372774 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzs8n\" (UniqueName: \"kubernetes.io/projected/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-kube-api-access-mzs8n\") pod \"collect-profiles-29321985-zsfwl\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.372887 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-config-volume\") pod \"collect-profiles-29321985-zsfwl\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.372942 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-secret-volume\") pod \"collect-profiles-29321985-zsfwl\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.375002 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-config-volume\") pod \"collect-profiles-29321985-zsfwl\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.379055 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-secret-volume\") pod \"collect-profiles-29321985-zsfwl\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.397178 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzs8n\" (UniqueName: \"kubernetes.io/projected/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-kube-api-access-mzs8n\") pod \"collect-profiles-29321985-zsfwl\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.477973 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:00 crc kubenswrapper[4669]: I1001 11:45:00.937150 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl"] Oct 01 11:45:00 crc kubenswrapper[4669]: W1001 11:45:00.946369 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58dcf4ef_bc6a_4b6b_a976_370b66cc762c.slice/crio-8dfb4ad1ffa67b13dd3df00ea0a9f00e5d67555087de278b4d4fdc207e996fc0 WatchSource:0}: Error finding container 8dfb4ad1ffa67b13dd3df00ea0a9f00e5d67555087de278b4d4fdc207e996fc0: Status 404 returned error can't find the container with id 8dfb4ad1ffa67b13dd3df00ea0a9f00e5d67555087de278b4d4fdc207e996fc0 Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.143205 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" event={"ID":"58dcf4ef-bc6a-4b6b-a976-370b66cc762c","Type":"ContainerStarted","Data":"2bb432865e250c0206120efca081e75961ca357d25d2529f5ae2670db2ef1c14"} Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.144221 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" event={"ID":"58dcf4ef-bc6a-4b6b-a976-370b66cc762c","Type":"ContainerStarted","Data":"8dfb4ad1ffa67b13dd3df00ea0a9f00e5d67555087de278b4d4fdc207e996fc0"} Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.150224 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"f424bdd50a1f189c133c05256b06afa59827b81a4f948ff38f618fca203ff016"} Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.150290 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"149472b9a5592aa47787bb4620d324e3530c5e983fb2cec7f7aa75d03b5edc6f"} Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.150306 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"37c998bd28b8c3c60513a1bb98800367091a1cc4f2e17ccf5a58545e64b5d945"} Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.150319 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"bb14aa2e976a1df97b904567ba00cc9b20d3fcf2303ea588583592366a6f893b"} Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.169077 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" podStartSLOduration=1.169050401 podStartE2EDuration="1.169050401s" podCreationTimestamp="2025-10-01 11:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:45:01.165252219 +0000 UTC m=+992.264817206" watchObservedRunningTime="2025-10-01 11:45:01.169050401 +0000 UTC m=+992.268615388" Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.625436 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5df9-account-create-j88kr" Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.697850 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw5zk\" (UniqueName: \"kubernetes.io/projected/6ce18eda-e988-4897-bd2f-8e656c93b271-kube-api-access-gw5zk\") pod \"6ce18eda-e988-4897-bd2f-8e656c93b271\" (UID: \"6ce18eda-e988-4897-bd2f-8e656c93b271\") " Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.715578 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce18eda-e988-4897-bd2f-8e656c93b271-kube-api-access-gw5zk" (OuterVolumeSpecName: "kube-api-access-gw5zk") pod "6ce18eda-e988-4897-bd2f-8e656c93b271" (UID: "6ce18eda-e988-4897-bd2f-8e656c93b271"). InnerVolumeSpecName "kube-api-access-gw5zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.800265 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw5zk\" (UniqueName: \"kubernetes.io/projected/6ce18eda-e988-4897-bd2f-8e656c93b271-kube-api-access-gw5zk\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.864127 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:45:01 crc kubenswrapper[4669]: I1001 11:45:01.864215 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.165346 
4669 generic.go:334] "Generic (PLEG): container finished" podID="58dcf4ef-bc6a-4b6b-a976-370b66cc762c" containerID="2bb432865e250c0206120efca081e75961ca357d25d2529f5ae2670db2ef1c14" exitCode=0 Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.165486 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" event={"ID":"58dcf4ef-bc6a-4b6b-a976-370b66cc762c","Type":"ContainerDied","Data":"2bb432865e250c0206120efca081e75961ca357d25d2529f5ae2670db2ef1c14"} Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.185896 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5df9-account-create-j88kr" event={"ID":"6ce18eda-e988-4897-bd2f-8e656c93b271","Type":"ContainerDied","Data":"a8dba6dfec5c275228fd110678c32f4b4c81acad5443b1627de7eeef20aec9a7"} Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.185947 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8dba6dfec5c275228fd110678c32f4b4c81acad5443b1627de7eeef20aec9a7" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.186020 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5df9-account-create-j88kr" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.488630 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5593-account-create-gvtw4"] Oct 01 11:45:02 crc kubenswrapper[4669]: E1001 11:45:02.489546 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce18eda-e988-4897-bd2f-8e656c93b271" containerName="mariadb-account-create" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.489566 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce18eda-e988-4897-bd2f-8e656c93b271" containerName="mariadb-account-create" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.489783 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce18eda-e988-4897-bd2f-8e656c93b271" containerName="mariadb-account-create" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.490576 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5593-account-create-gvtw4" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.495801 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-plhdj" podUID="c5ffe639-af06-4c4c-8794-a1becff8a692" containerName="ovn-controller" probeResult="failure" output=< Oct 01 11:45:02 crc kubenswrapper[4669]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 11:45:02 crc kubenswrapper[4669]: > Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.496124 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.509276 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.509744 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-5593-account-create-gvtw4"] Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.512774 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tvnc\" (UniqueName: \"kubernetes.io/projected/c280659a-1e6a-4f60-b793-0147fe1b4ecf-kube-api-access-8tvnc\") pod \"keystone-5593-account-create-gvtw4\" (UID: \"c280659a-1e6a-4f60-b793-0147fe1b4ecf\") " pod="openstack/keystone-5593-account-create-gvtw4" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.537827 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-d5fz7" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.615345 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tvnc\" (UniqueName: \"kubernetes.io/projected/c280659a-1e6a-4f60-b793-0147fe1b4ecf-kube-api-access-8tvnc\") pod \"keystone-5593-account-create-gvtw4\" (UID: \"c280659a-1e6a-4f60-b793-0147fe1b4ecf\") " pod="openstack/keystone-5593-account-create-gvtw4" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.639120 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tvnc\" (UniqueName: \"kubernetes.io/projected/c280659a-1e6a-4f60-b793-0147fe1b4ecf-kube-api-access-8tvnc\") pod \"keystone-5593-account-create-gvtw4\" (UID: \"c280659a-1e6a-4f60-b793-0147fe1b4ecf\") " pod="openstack/keystone-5593-account-create-gvtw4" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.765934 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-plhdj-config-hnblz"] Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.768859 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.783613 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.790803 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plhdj-config-hnblz"] Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.818864 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-scripts\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.818927 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtd9j\" (UniqueName: \"kubernetes.io/projected/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-kube-api-access-dtd9j\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.818996 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-log-ovn\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.819015 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: 
\"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.819035 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-additional-scripts\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.819053 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run-ovn\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.823635 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5593-account-create-gvtw4" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.870168 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-392f-account-create-vvqc8"] Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.871619 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-392f-account-create-vvqc8" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.880566 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.922912 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-scripts\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.922999 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtd9j\" (UniqueName: \"kubernetes.io/projected/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-kube-api-access-dtd9j\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.923076 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rj9q\" (UniqueName: \"kubernetes.io/projected/ddf38126-0bdb-4a19-86d7-469597350f4f-kube-api-access-4rj9q\") pod \"placement-392f-account-create-vvqc8\" (UID: \"ddf38126-0bdb-4a19-86d7-469597350f4f\") " pod="openstack/placement-392f-account-create-vvqc8" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.923130 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-log-ovn\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.923148 4669 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.923169 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-additional-scripts\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.923188 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run-ovn\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.923553 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run-ovn\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.923616 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-log-ovn\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.924225 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.924454 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-additional-scripts\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.926337 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-scripts\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.926390 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-392f-account-create-vvqc8"] Oct 01 11:45:02 crc kubenswrapper[4669]: I1001 11:45:02.985216 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtd9j\" (UniqueName: \"kubernetes.io/projected/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-kube-api-access-dtd9j\") pod \"ovn-controller-plhdj-config-hnblz\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.026013 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rj9q\" (UniqueName: \"kubernetes.io/projected/ddf38126-0bdb-4a19-86d7-469597350f4f-kube-api-access-4rj9q\") pod \"placement-392f-account-create-vvqc8\" (UID: \"ddf38126-0bdb-4a19-86d7-469597350f4f\") " pod="openstack/placement-392f-account-create-vvqc8" Oct 01 
11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.064335 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rj9q\" (UniqueName: \"kubernetes.io/projected/ddf38126-0bdb-4a19-86d7-469597350f4f-kube-api-access-4rj9q\") pod \"placement-392f-account-create-vvqc8\" (UID: \"ddf38126-0bdb-4a19-86d7-469597350f4f\") " pod="openstack/placement-392f-account-create-vvqc8" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.108547 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.219042 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"c412b50b4aedd9a663db971fd38f4236fec0a2b75d11c978023d9d22e55a2418"} Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.219101 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"feaa69965cdcb7140bd181fc80ad9468efc61dc4124c04136298f06b36b07872"} Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.283445 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-392f-account-create-vvqc8" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.416392 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-s89kf"] Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.417977 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.421025 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s89kf"] Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.421762 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.421810 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wkq9x" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.436593 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-combined-ca-bundle\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.436706 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvggx\" (UniqueName: \"kubernetes.io/projected/6c85d289-ff7f-4b57-a54a-cb272dec58e2-kube-api-access-kvggx\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.436734 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-db-sync-config-data\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.436797 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-config-data\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.470427 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5593-account-create-gvtw4"] Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.541313 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-config-data\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.541382 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-combined-ca-bundle\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.541447 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvggx\" (UniqueName: \"kubernetes.io/projected/6c85d289-ff7f-4b57-a54a-cb272dec58e2-kube-api-access-kvggx\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.541465 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-db-sync-config-data\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.551154 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-db-sync-config-data\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.552322 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-config-data\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.555931 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-combined-ca-bundle\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.573370 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvggx\" (UniqueName: \"kubernetes.io/projected/6c85d289-ff7f-4b57-a54a-cb272dec58e2-kube-api-access-kvggx\") pod \"glance-db-sync-s89kf\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.606319 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.644452 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-config-volume" (OuterVolumeSpecName: "config-volume") pod "58dcf4ef-bc6a-4b6b-a976-370b66cc762c" (UID: "58dcf4ef-bc6a-4b6b-a976-370b66cc762c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.651476 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-config-volume\") pod \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.651567 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzs8n\" (UniqueName: \"kubernetes.io/projected/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-kube-api-access-mzs8n\") pod \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.652366 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-secret-volume\") pod \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\" (UID: \"58dcf4ef-bc6a-4b6b-a976-370b66cc762c\") " Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.654018 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.670561 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "58dcf4ef-bc6a-4b6b-a976-370b66cc762c" (UID: "58dcf4ef-bc6a-4b6b-a976-370b66cc762c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.671252 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-kube-api-access-mzs8n" (OuterVolumeSpecName: "kube-api-access-mzs8n") pod "58dcf4ef-bc6a-4b6b-a976-370b66cc762c" (UID: "58dcf4ef-bc6a-4b6b-a976-370b66cc762c"). InnerVolumeSpecName "kube-api-access-mzs8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.748177 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s89kf" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.755514 4669 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.755557 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzs8n\" (UniqueName: \"kubernetes.io/projected/58dcf4ef-bc6a-4b6b-a976-370b66cc762c-kube-api-access-mzs8n\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.864226 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plhdj-config-hnblz"] Oct 01 11:45:03 crc kubenswrapper[4669]: I1001 11:45:03.943306 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-392f-account-create-vvqc8"] Oct 01 11:45:03 crc kubenswrapper[4669]: W1001 11:45:03.953495 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddf38126_0bdb_4a19_86d7_469597350f4f.slice/crio-eb33eacb13f1814375f16ecbe986e51308844c7521e550c7888da4eca928323e WatchSource:0}: Error finding container eb33eacb13f1814375f16ecbe986e51308844c7521e550c7888da4eca928323e: 
Status 404 returned error can't find the container with id eb33eacb13f1814375f16ecbe986e51308844c7521e550c7888da4eca928323e Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.233750 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"681d4309-a9a8-4c2c-bf25-4619653187fd","Type":"ContainerStarted","Data":"c694a5d44a8a1d3c5451743785aa75de7cb5f1487692bb807017819f4134beba"} Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.237297 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-392f-account-create-vvqc8" event={"ID":"ddf38126-0bdb-4a19-86d7-469597350f4f","Type":"ContainerStarted","Data":"8542db483d60c269ea4a36a8741638b0b0bc3e35efa5c978c0cc59f783b9b7ed"} Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.237330 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-392f-account-create-vvqc8" event={"ID":"ddf38126-0bdb-4a19-86d7-469597350f4f","Type":"ContainerStarted","Data":"eb33eacb13f1814375f16ecbe986e51308844c7521e550c7888da4eca928323e"} Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.239634 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.239651 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl" event={"ID":"58dcf4ef-bc6a-4b6b-a976-370b66cc762c","Type":"ContainerDied","Data":"8dfb4ad1ffa67b13dd3df00ea0a9f00e5d67555087de278b4d4fdc207e996fc0"} Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.239722 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dfb4ad1ffa67b13dd3df00ea0a9f00e5d67555087de278b4d4fdc207e996fc0" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.243248 4669 generic.go:334] "Generic (PLEG): container finished" podID="c280659a-1e6a-4f60-b793-0147fe1b4ecf" containerID="c049882bbc6dbbffe2abad1a14c6f12258de9c60de2cba855e31e56e5ed32b4b" exitCode=0 Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.243489 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5593-account-create-gvtw4" event={"ID":"c280659a-1e6a-4f60-b793-0147fe1b4ecf","Type":"ContainerDied","Data":"c049882bbc6dbbffe2abad1a14c6f12258de9c60de2cba855e31e56e5ed32b4b"} Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.243587 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5593-account-create-gvtw4" event={"ID":"c280659a-1e6a-4f60-b793-0147fe1b4ecf","Type":"ContainerStarted","Data":"dc6f092d45a2b1f24d3b3224c6774d9992bf1f0dfe3abbeead21a165ef6656a7"} Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.247495 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plhdj-config-hnblz" event={"ID":"ee6d9300-dc8c-456b-aa8f-ff2d3800963d","Type":"ContainerStarted","Data":"a5a6af7fc1e0ea908f7895482e3ed39a884da2cb313073c67d904ca29f425aae"} Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.319256 4669 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/swift-storage-0" podStartSLOduration=22.303738448 podStartE2EDuration="30.319225252s" podCreationTimestamp="2025-10-01 11:44:34 +0000 UTC" firstStartedPulling="2025-10-01 11:44:52.268119357 +0000 UTC m=+983.367684334" lastFinishedPulling="2025-10-01 11:45:00.283606161 +0000 UTC m=+991.383171138" observedRunningTime="2025-10-01 11:45:04.315650336 +0000 UTC m=+995.415215313" watchObservedRunningTime="2025-10-01 11:45:04.319225252 +0000 UTC m=+995.418790230" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.352305 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-392f-account-create-vvqc8" podStartSLOduration=2.352282629 podStartE2EDuration="2.352282629s" podCreationTimestamp="2025-10-01 11:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:45:04.338361499 +0000 UTC m=+995.437926476" watchObservedRunningTime="2025-10-01 11:45:04.352282629 +0000 UTC m=+995.451847606" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.444964 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s89kf"] Oct 01 11:45:04 crc kubenswrapper[4669]: W1001 11:45:04.452895 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c85d289_ff7f_4b57_a54a_cb272dec58e2.slice/crio-291c9bd23cd59d41cbfd4c61d2e1dd77bbf9286038dfecc6226508fa1be80c33 WatchSource:0}: Error finding container 291c9bd23cd59d41cbfd4c61d2e1dd77bbf9286038dfecc6226508fa1be80c33: Status 404 returned error can't find the container with id 291c9bd23cd59d41cbfd4c61d2e1dd77bbf9286038dfecc6226508fa1be80c33 Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.638858 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-nwdj2"] Oct 01 11:45:04 crc kubenswrapper[4669]: E1001 11:45:04.639641 4669 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58dcf4ef-bc6a-4b6b-a976-370b66cc762c" containerName="collect-profiles" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.639667 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="58dcf4ef-bc6a-4b6b-a976-370b66cc762c" containerName="collect-profiles" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.639956 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="58dcf4ef-bc6a-4b6b-a976-370b66cc762c" containerName="collect-profiles" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.641156 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.660833 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.667797 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-nwdj2"] Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.775045 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.775213 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-config\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.775248 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.775292 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.775327 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hvqx\" (UniqueName: \"kubernetes.io/projected/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-kube-api-access-8hvqx\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.775344 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.878300 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-config\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.878378 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.878429 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.878474 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hvqx\" (UniqueName: \"kubernetes.io/projected/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-kube-api-access-8hvqx\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.878499 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.878575 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.879767 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.879834 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.879856 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.880193 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.880307 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-config\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:04 crc kubenswrapper[4669]: I1001 11:45:04.901095 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hvqx\" (UniqueName: 
\"kubernetes.io/projected/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-kube-api-access-8hvqx\") pod \"dnsmasq-dns-77585f5f8c-nwdj2\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:05 crc kubenswrapper[4669]: I1001 11:45:05.057466 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:05 crc kubenswrapper[4669]: I1001 11:45:05.261851 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s89kf" event={"ID":"6c85d289-ff7f-4b57-a54a-cb272dec58e2","Type":"ContainerStarted","Data":"291c9bd23cd59d41cbfd4c61d2e1dd77bbf9286038dfecc6226508fa1be80c33"} Oct 01 11:45:05 crc kubenswrapper[4669]: I1001 11:45:05.263737 4669 generic.go:334] "Generic (PLEG): container finished" podID="ddf38126-0bdb-4a19-86d7-469597350f4f" containerID="8542db483d60c269ea4a36a8741638b0b0bc3e35efa5c978c0cc59f783b9b7ed" exitCode=0 Oct 01 11:45:05 crc kubenswrapper[4669]: I1001 11:45:05.263797 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-392f-account-create-vvqc8" event={"ID":"ddf38126-0bdb-4a19-86d7-469597350f4f","Type":"ContainerDied","Data":"8542db483d60c269ea4a36a8741638b0b0bc3e35efa5c978c0cc59f783b9b7ed"} Oct 01 11:45:05 crc kubenswrapper[4669]: I1001 11:45:05.265037 4669 generic.go:334] "Generic (PLEG): container finished" podID="ee6d9300-dc8c-456b-aa8f-ff2d3800963d" containerID="4a0c94e531789cbc7fdb5d291d83f148863fe78b4310a3f2bac4b6c6abb91062" exitCode=0 Oct 01 11:45:05 crc kubenswrapper[4669]: I1001 11:45:05.265195 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plhdj-config-hnblz" event={"ID":"ee6d9300-dc8c-456b-aa8f-ff2d3800963d","Type":"ContainerDied","Data":"4a0c94e531789cbc7fdb5d291d83f148863fe78b4310a3f2bac4b6c6abb91062"} Oct 01 11:45:05 crc kubenswrapper[4669]: I1001 11:45:05.714491 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5593-account-create-gvtw4" Oct 01 11:45:05 crc kubenswrapper[4669]: I1001 11:45:05.800867 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tvnc\" (UniqueName: \"kubernetes.io/projected/c280659a-1e6a-4f60-b793-0147fe1b4ecf-kube-api-access-8tvnc\") pod \"c280659a-1e6a-4f60-b793-0147fe1b4ecf\" (UID: \"c280659a-1e6a-4f60-b793-0147fe1b4ecf\") " Oct 01 11:45:05 crc kubenswrapper[4669]: I1001 11:45:05.838430 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c280659a-1e6a-4f60-b793-0147fe1b4ecf-kube-api-access-8tvnc" (OuterVolumeSpecName: "kube-api-access-8tvnc") pod "c280659a-1e6a-4f60-b793-0147fe1b4ecf" (UID: "c280659a-1e6a-4f60-b793-0147fe1b4ecf"). InnerVolumeSpecName "kube-api-access-8tvnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:05 crc kubenswrapper[4669]: I1001 11:45:05.852802 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-nwdj2"] Oct 01 11:45:05 crc kubenswrapper[4669]: I1001 11:45:05.904007 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tvnc\" (UniqueName: \"kubernetes.io/projected/c280659a-1e6a-4f60-b793-0147fe1b4ecf-kube-api-access-8tvnc\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.277603 4669 generic.go:334] "Generic (PLEG): container finished" podID="941f43bf-37b4-451f-a1e9-53ebcbebd0f1" containerID="22fb6b1698440678931072c89b6015d5277af68319ec2fcee3ebf277a93d1272" exitCode=0 Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.277692 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" event={"ID":"941f43bf-37b4-451f-a1e9-53ebcbebd0f1","Type":"ContainerDied","Data":"22fb6b1698440678931072c89b6015d5277af68319ec2fcee3ebf277a93d1272"} Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.277730 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" event={"ID":"941f43bf-37b4-451f-a1e9-53ebcbebd0f1","Type":"ContainerStarted","Data":"ac30adf3cd7a90b7e445df7c0294c829674719d40c1e6d0bbc45eef04bbe15b9"} Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.280430 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5593-account-create-gvtw4" event={"ID":"c280659a-1e6a-4f60-b793-0147fe1b4ecf","Type":"ContainerDied","Data":"dc6f092d45a2b1f24d3b3224c6774d9992bf1f0dfe3abbeead21a165ef6656a7"} Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.280495 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6f092d45a2b1f24d3b3224c6774d9992bf1f0dfe3abbeead21a165ef6656a7" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.280862 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5593-account-create-gvtw4" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.630097 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-392f-account-create-vvqc8" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.721446 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rj9q\" (UniqueName: \"kubernetes.io/projected/ddf38126-0bdb-4a19-86d7-469597350f4f-kube-api-access-4rj9q\") pod \"ddf38126-0bdb-4a19-86d7-469597350f4f\" (UID: \"ddf38126-0bdb-4a19-86d7-469597350f4f\") " Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.728476 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf38126-0bdb-4a19-86d7-469597350f4f-kube-api-access-4rj9q" (OuterVolumeSpecName: "kube-api-access-4rj9q") pod "ddf38126-0bdb-4a19-86d7-469597350f4f" (UID: "ddf38126-0bdb-4a19-86d7-469597350f4f"). InnerVolumeSpecName "kube-api-access-4rj9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.784404 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.825872 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rj9q\" (UniqueName: \"kubernetes.io/projected/ddf38126-0bdb-4a19-86d7-469597350f4f-kube-api-access-4rj9q\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.927269 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run-ovn\") pod \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.927341 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-scripts\") pod \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.927462 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-log-ovn\") pod \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.927593 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtd9j\" (UniqueName: \"kubernetes.io/projected/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-kube-api-access-dtd9j\") pod \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.927635 
4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-additional-scripts\") pod \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.927748 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run\") pod \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\" (UID: \"ee6d9300-dc8c-456b-aa8f-ff2d3800963d\") " Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.928256 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run" (OuterVolumeSpecName: "var-run") pod "ee6d9300-dc8c-456b-aa8f-ff2d3800963d" (UID: "ee6d9300-dc8c-456b-aa8f-ff2d3800963d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.928304 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ee6d9300-dc8c-456b-aa8f-ff2d3800963d" (UID: "ee6d9300-dc8c-456b-aa8f-ff2d3800963d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.929256 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ee6d9300-dc8c-456b-aa8f-ff2d3800963d" (UID: "ee6d9300-dc8c-456b-aa8f-ff2d3800963d"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.929310 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ee6d9300-dc8c-456b-aa8f-ff2d3800963d" (UID: "ee6d9300-dc8c-456b-aa8f-ff2d3800963d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.929591 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-scripts" (OuterVolumeSpecName: "scripts") pod "ee6d9300-dc8c-456b-aa8f-ff2d3800963d" (UID: "ee6d9300-dc8c-456b-aa8f-ff2d3800963d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:06 crc kubenswrapper[4669]: I1001 11:45:06.935570 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-kube-api-access-dtd9j" (OuterVolumeSpecName: "kube-api-access-dtd9j") pod "ee6d9300-dc8c-456b-aa8f-ff2d3800963d" (UID: "ee6d9300-dc8c-456b-aa8f-ff2d3800963d"). InnerVolumeSpecName "kube-api-access-dtd9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.030726 4669 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.030792 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtd9j\" (UniqueName: \"kubernetes.io/projected/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-kube-api-access-dtd9j\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.030803 4669 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.030813 4669 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.030822 4669 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.030831 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee6d9300-dc8c-456b-aa8f-ff2d3800963d-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.293508 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-392f-account-create-vvqc8" event={"ID":"ddf38126-0bdb-4a19-86d7-469597350f4f","Type":"ContainerDied","Data":"eb33eacb13f1814375f16ecbe986e51308844c7521e550c7888da4eca928323e"} Oct 01 11:45:07 crc 
kubenswrapper[4669]: I1001 11:45:07.293575 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb33eacb13f1814375f16ecbe986e51308844c7521e550c7888da4eca928323e" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.293570 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-392f-account-create-vvqc8" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.297877 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" event={"ID":"941f43bf-37b4-451f-a1e9-53ebcbebd0f1","Type":"ContainerStarted","Data":"a6ba5796a6d3416538a0af7dc289a2bc98f52a02d5a2bc45099a7dc04de54970"} Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.297987 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.302890 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plhdj-config-hnblz" event={"ID":"ee6d9300-dc8c-456b-aa8f-ff2d3800963d","Type":"ContainerDied","Data":"a5a6af7fc1e0ea908f7895482e3ed39a884da2cb313073c67d904ca29f425aae"} Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.302937 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-plhdj-config-hnblz" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.302949 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5a6af7fc1e0ea908f7895482e3ed39a884da2cb313073c67d904ca29f425aae" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.321344 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" podStartSLOduration=3.321319443 podStartE2EDuration="3.321319443s" podCreationTimestamp="2025-10-01 11:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:45:07.317561002 +0000 UTC m=+998.417125979" watchObservedRunningTime="2025-10-01 11:45:07.321319443 +0000 UTC m=+998.420884420" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.510720 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-plhdj" Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.908601 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-plhdj-config-hnblz"] Oct 01 11:45:07 crc kubenswrapper[4669]: I1001 11:45:07.942993 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-plhdj-config-hnblz"] Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.060827 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-plhdj-config-4m4gk"] Oct 01 11:45:08 crc kubenswrapper[4669]: E1001 11:45:08.061474 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6d9300-dc8c-456b-aa8f-ff2d3800963d" containerName="ovn-config" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.061497 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6d9300-dc8c-456b-aa8f-ff2d3800963d" containerName="ovn-config" Oct 01 11:45:08 crc kubenswrapper[4669]: E1001 11:45:08.061525 4669 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c280659a-1e6a-4f60-b793-0147fe1b4ecf" containerName="mariadb-account-create" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.061540 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c280659a-1e6a-4f60-b793-0147fe1b4ecf" containerName="mariadb-account-create" Oct 01 11:45:08 crc kubenswrapper[4669]: E1001 11:45:08.061568 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf38126-0bdb-4a19-86d7-469597350f4f" containerName="mariadb-account-create" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.061586 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf38126-0bdb-4a19-86d7-469597350f4f" containerName="mariadb-account-create" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.062003 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6d9300-dc8c-456b-aa8f-ff2d3800963d" containerName="ovn-config" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.062039 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="c280659a-1e6a-4f60-b793-0147fe1b4ecf" containerName="mariadb-account-create" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.062211 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf38126-0bdb-4a19-86d7-469597350f4f" containerName="mariadb-account-create" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.063256 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.069452 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plhdj-config-4m4gk"] Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.097465 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.157805 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.157883 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-log-ovn\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.157916 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt6rt\" (UniqueName: \"kubernetes.io/projected/60085ab1-2ed6-4050-bb56-a658aff45389-kube-api-access-wt6rt\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.157973 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-additional-scripts\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: 
\"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.158038 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run-ovn\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.158060 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-scripts\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.260704 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run-ovn\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.260794 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-scripts\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.260904 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: 
\"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.260943 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-log-ovn\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.260986 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt6rt\" (UniqueName: \"kubernetes.io/projected/60085ab1-2ed6-4050-bb56-a658aff45389-kube-api-access-wt6rt\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.261026 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-additional-scripts\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.261913 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run-ovn\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.261927 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-log-ovn\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: 
\"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.262027 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.262391 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-additional-scripts\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.266883 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-scripts\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.280562 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt6rt\" (UniqueName: \"kubernetes.io/projected/60085ab1-2ed6-4050-bb56-a658aff45389-kube-api-access-wt6rt\") pod \"ovn-controller-plhdj-config-4m4gk\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:08 crc kubenswrapper[4669]: I1001 11:45:08.421564 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:09 crc kubenswrapper[4669]: I1001 11:45:09.070685 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plhdj-config-4m4gk"] Oct 01 11:45:09 crc kubenswrapper[4669]: I1001 11:45:09.328279 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plhdj-config-4m4gk" event={"ID":"60085ab1-2ed6-4050-bb56-a658aff45389","Type":"ContainerStarted","Data":"350c1868d4de1403bf65a19ecbd79231c049238249c632e3590d257de0caed43"} Oct 01 11:45:09 crc kubenswrapper[4669]: I1001 11:45:09.514369 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 11:45:09 crc kubenswrapper[4669]: I1001 11:45:09.666183 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee6d9300-dc8c-456b-aa8f-ff2d3800963d" path="/var/lib/kubelet/pods/ee6d9300-dc8c-456b-aa8f-ff2d3800963d/volumes" Oct 01 11:45:09 crc kubenswrapper[4669]: I1001 11:45:09.926869 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-4stcf"] Oct 01 11:45:09 crc kubenswrapper[4669]: I1001 11:45:09.928416 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-4stcf" Oct 01 11:45:09 crc kubenswrapper[4669]: I1001 11:45:09.945368 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4stcf"] Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.007891 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4t7g\" (UniqueName: \"kubernetes.io/projected/b7377422-847a-48cf-9248-7126e2fda461-kube-api-access-w4t7g\") pod \"cinder-db-create-4stcf\" (UID: \"b7377422-847a-48cf-9248-7126e2fda461\") " pod="openstack/cinder-db-create-4stcf" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.033836 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-5srjm"] Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.035226 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5srjm" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.046876 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5srjm"] Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.109815 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwslg\" (UniqueName: \"kubernetes.io/projected/48969021-b816-4ae7-a52c-f26845df0580-kube-api-access-vwslg\") pod \"barbican-db-create-5srjm\" (UID: \"48969021-b816-4ae7-a52c-f26845df0580\") " pod="openstack/barbican-db-create-5srjm" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.110255 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4t7g\" (UniqueName: \"kubernetes.io/projected/b7377422-847a-48cf-9248-7126e2fda461-kube-api-access-w4t7g\") pod \"cinder-db-create-4stcf\" (UID: \"b7377422-847a-48cf-9248-7126e2fda461\") " pod="openstack/cinder-db-create-4stcf" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.148393 
4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.151641 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4t7g\" (UniqueName: \"kubernetes.io/projected/b7377422-847a-48cf-9248-7126e2fda461-kube-api-access-w4t7g\") pod \"cinder-db-create-4stcf\" (UID: \"b7377422-847a-48cf-9248-7126e2fda461\") " pod="openstack/cinder-db-create-4stcf" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.212737 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwslg\" (UniqueName: \"kubernetes.io/projected/48969021-b816-4ae7-a52c-f26845df0580-kube-api-access-vwslg\") pod \"barbican-db-create-5srjm\" (UID: \"48969021-b816-4ae7-a52c-f26845df0580\") " pod="openstack/barbican-db-create-5srjm" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.236485 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwslg\" (UniqueName: \"kubernetes.io/projected/48969021-b816-4ae7-a52c-f26845df0580-kube-api-access-vwslg\") pod \"barbican-db-create-5srjm\" (UID: \"48969021-b816-4ae7-a52c-f26845df0580\") " pod="openstack/barbican-db-create-5srjm" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.247931 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4stcf" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.266694 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-jpbz8"] Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.273297 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-jpbz8" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.291051 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jpbz8"] Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.319600 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx7jt\" (UniqueName: \"kubernetes.io/projected/60db4b61-e005-45c2-a41c-9ba9e7709a90-kube-api-access-dx7jt\") pod \"neutron-db-create-jpbz8\" (UID: \"60db4b61-e005-45c2-a41c-9ba9e7709a90\") " pod="openstack/neutron-db-create-jpbz8" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.361700 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5srjm" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.386039 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ttgc2"] Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.389121 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.396472 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fz8wd" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.396306 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.396608 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.397048 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.410103 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ttgc2"] Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.411607 4669 generic.go:334] "Generic (PLEG): container finished" podID="60085ab1-2ed6-4050-bb56-a658aff45389" containerID="369c0c5fd26b948f45b9ba644bad9af4138341597e162def31a60aa135c2634d" exitCode=0 Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.411653 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plhdj-config-4m4gk" event={"ID":"60085ab1-2ed6-4050-bb56-a658aff45389","Type":"ContainerDied","Data":"369c0c5fd26b948f45b9ba644bad9af4138341597e162def31a60aa135c2634d"} Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.436802 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx7jt\" (UniqueName: \"kubernetes.io/projected/60db4b61-e005-45c2-a41c-9ba9e7709a90-kube-api-access-dx7jt\") pod \"neutron-db-create-jpbz8\" (UID: \"60db4b61-e005-45c2-a41c-9ba9e7709a90\") " pod="openstack/neutron-db-create-jpbz8" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.472654 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dx7jt\" (UniqueName: \"kubernetes.io/projected/60db4b61-e005-45c2-a41c-9ba9e7709a90-kube-api-access-dx7jt\") pod \"neutron-db-create-jpbz8\" (UID: \"60db4b61-e005-45c2-a41c-9ba9e7709a90\") " pod="openstack/neutron-db-create-jpbz8" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.542570 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p887\" (UniqueName: \"kubernetes.io/projected/22e5eb36-78d1-4d8c-85ec-330fae011103-kube-api-access-8p887\") pod \"keystone-db-sync-ttgc2\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.542630 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-config-data\") pod \"keystone-db-sync-ttgc2\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.542794 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-combined-ca-bundle\") pod \"keystone-db-sync-ttgc2\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.644333 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-combined-ca-bundle\") pod \"keystone-db-sync-ttgc2\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.644430 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p887\" 
(UniqueName: \"kubernetes.io/projected/22e5eb36-78d1-4d8c-85ec-330fae011103-kube-api-access-8p887\") pod \"keystone-db-sync-ttgc2\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.644452 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-config-data\") pod \"keystone-db-sync-ttgc2\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.651538 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-config-data\") pod \"keystone-db-sync-ttgc2\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.661449 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-combined-ca-bundle\") pod \"keystone-db-sync-ttgc2\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.664748 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p887\" (UniqueName: \"kubernetes.io/projected/22e5eb36-78d1-4d8c-85ec-330fae011103-kube-api-access-8p887\") pod \"keystone-db-sync-ttgc2\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.691671 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jpbz8" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.756903 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.938070 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5srjm"] Oct 01 11:45:10 crc kubenswrapper[4669]: W1001 11:45:10.961116 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48969021_b816_4ae7_a52c_f26845df0580.slice/crio-e1f25482d1d3d2dce5453ab8beb671a8894f6775d99adeea5f53f00e47cd3af9 WatchSource:0}: Error finding container e1f25482d1d3d2dce5453ab8beb671a8894f6775d99adeea5f53f00e47cd3af9: Status 404 returned error can't find the container with id e1f25482d1d3d2dce5453ab8beb671a8894f6775d99adeea5f53f00e47cd3af9 Oct 01 11:45:10 crc kubenswrapper[4669]: I1001 11:45:10.961951 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4stcf"] Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.105946 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ttgc2"] Oct 01 11:45:11 crc kubenswrapper[4669]: W1001 11:45:11.132587 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22e5eb36_78d1_4d8c_85ec_330fae011103.slice/crio-b8740600e74a1fff814b188383acdc525abfe6b1a56b4076ca17a647ce2df24e WatchSource:0}: Error finding container b8740600e74a1fff814b188383acdc525abfe6b1a56b4076ca17a647ce2df24e: Status 404 returned error can't find the container with id b8740600e74a1fff814b188383acdc525abfe6b1a56b4076ca17a647ce2df24e Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.219766 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jpbz8"] Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.435740 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ttgc2" 
event={"ID":"22e5eb36-78d1-4d8c-85ec-330fae011103","Type":"ContainerStarted","Data":"b8740600e74a1fff814b188383acdc525abfe6b1a56b4076ca17a647ce2df24e"} Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.444846 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4stcf" event={"ID":"b7377422-847a-48cf-9248-7126e2fda461","Type":"ContainerStarted","Data":"8712159fecc46b7da0621abb792f3ac80a839cf4e6caa9246907464a882aaafb"} Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.444878 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4stcf" event={"ID":"b7377422-847a-48cf-9248-7126e2fda461","Type":"ContainerStarted","Data":"b796f7a4bf5862d7d5457900b1a38590d11cb87efdf452dad2652331a6e1f64e"} Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.463676 4669 generic.go:334] "Generic (PLEG): container finished" podID="48969021-b816-4ae7-a52c-f26845df0580" containerID="eaffd0b2a5721d1440159064d5ae31f27b5c8556da99e327b69e1af73951ee1a" exitCode=0 Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.463750 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5srjm" event={"ID":"48969021-b816-4ae7-a52c-f26845df0580","Type":"ContainerDied","Data":"eaffd0b2a5721d1440159064d5ae31f27b5c8556da99e327b69e1af73951ee1a"} Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.463795 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5srjm" event={"ID":"48969021-b816-4ae7-a52c-f26845df0580","Type":"ContainerStarted","Data":"e1f25482d1d3d2dce5453ab8beb671a8894f6775d99adeea5f53f00e47cd3af9"} Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.475335 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jpbz8" event={"ID":"60db4b61-e005-45c2-a41c-9ba9e7709a90","Type":"ContainerStarted","Data":"fafbf17a957a30559a9f42b05f5c9e0630ba324f518feff44d8cdb84b167233b"} Oct 01 11:45:11 crc kubenswrapper[4669]: 
I1001 11:45:11.475418 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jpbz8" event={"ID":"60db4b61-e005-45c2-a41c-9ba9e7709a90","Type":"ContainerStarted","Data":"45837b554a88e7ec94edffdae804fb5f80a623991cc70330d37f9ed197919a04"} Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.494990 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-4stcf" podStartSLOduration=2.494968491 podStartE2EDuration="2.494968491s" podCreationTimestamp="2025-10-01 11:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:45:11.494721355 +0000 UTC m=+1002.594286332" watchObservedRunningTime="2025-10-01 11:45:11.494968491 +0000 UTC m=+1002.594533468" Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.577128 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-jpbz8" podStartSLOduration=1.577104844 podStartE2EDuration="1.577104844s" podCreationTimestamp="2025-10-01 11:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:45:11.574433339 +0000 UTC m=+1002.673998316" watchObservedRunningTime="2025-10-01 11:45:11.577104844 +0000 UTC m=+1002.676669821" Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.852585 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.981944 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt6rt\" (UniqueName: \"kubernetes.io/projected/60085ab1-2ed6-4050-bb56-a658aff45389-kube-api-access-wt6rt\") pod \"60085ab1-2ed6-4050-bb56-a658aff45389\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.982227 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-scripts\") pod \"60085ab1-2ed6-4050-bb56-a658aff45389\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.982250 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run-ovn\") pod \"60085ab1-2ed6-4050-bb56-a658aff45389\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.982322 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-log-ovn\") pod \"60085ab1-2ed6-4050-bb56-a658aff45389\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.982378 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run\") pod \"60085ab1-2ed6-4050-bb56-a658aff45389\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.982416 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-additional-scripts\") pod \"60085ab1-2ed6-4050-bb56-a658aff45389\" (UID: \"60085ab1-2ed6-4050-bb56-a658aff45389\") " Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.982475 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "60085ab1-2ed6-4050-bb56-a658aff45389" (UID: "60085ab1-2ed6-4050-bb56-a658aff45389"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.982528 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run" (OuterVolumeSpecName: "var-run") pod "60085ab1-2ed6-4050-bb56-a658aff45389" (UID: "60085ab1-2ed6-4050-bb56-a658aff45389"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.982869 4669 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.983293 4669 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.983608 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "60085ab1-2ed6-4050-bb56-a658aff45389" (UID: "60085ab1-2ed6-4050-bb56-a658aff45389"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.982605 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "60085ab1-2ed6-4050-bb56-a658aff45389" (UID: "60085ab1-2ed6-4050-bb56-a658aff45389"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.983765 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-scripts" (OuterVolumeSpecName: "scripts") pod "60085ab1-2ed6-4050-bb56-a658aff45389" (UID: "60085ab1-2ed6-4050-bb56-a658aff45389"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:11 crc kubenswrapper[4669]: I1001 11:45:11.989710 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60085ab1-2ed6-4050-bb56-a658aff45389-kube-api-access-wt6rt" (OuterVolumeSpecName: "kube-api-access-wt6rt") pod "60085ab1-2ed6-4050-bb56-a658aff45389" (UID: "60085ab1-2ed6-4050-bb56-a658aff45389"). InnerVolumeSpecName "kube-api-access-wt6rt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.085832 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt6rt\" (UniqueName: \"kubernetes.io/projected/60085ab1-2ed6-4050-bb56-a658aff45389-kube-api-access-wt6rt\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.085878 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.085888 4669 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60085ab1-2ed6-4050-bb56-a658aff45389-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.085900 4669 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60085ab1-2ed6-4050-bb56-a658aff45389-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.487229 4669 generic.go:334] "Generic (PLEG): container finished" podID="60db4b61-e005-45c2-a41c-9ba9e7709a90" containerID="fafbf17a957a30559a9f42b05f5c9e0630ba324f518feff44d8cdb84b167233b" exitCode=0 Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.487311 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jpbz8" event={"ID":"60db4b61-e005-45c2-a41c-9ba9e7709a90","Type":"ContainerDied","Data":"fafbf17a957a30559a9f42b05f5c9e0630ba324f518feff44d8cdb84b167233b"} Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.492257 4669 generic.go:334] "Generic (PLEG): container finished" podID="b7377422-847a-48cf-9248-7126e2fda461" containerID="8712159fecc46b7da0621abb792f3ac80a839cf4e6caa9246907464a882aaafb" exitCode=0 Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 
11:45:12.492323 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4stcf" event={"ID":"b7377422-847a-48cf-9248-7126e2fda461","Type":"ContainerDied","Data":"8712159fecc46b7da0621abb792f3ac80a839cf4e6caa9246907464a882aaafb"} Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.499848 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-plhdj-config-4m4gk" Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.499911 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plhdj-config-4m4gk" event={"ID":"60085ab1-2ed6-4050-bb56-a658aff45389","Type":"ContainerDied","Data":"350c1868d4de1403bf65a19ecbd79231c049238249c632e3590d257de0caed43"} Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.499937 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="350c1868d4de1403bf65a19ecbd79231c049238249c632e3590d257de0caed43" Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.933355 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5srjm" Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.939478 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-plhdj-config-4m4gk"] Oct 01 11:45:12 crc kubenswrapper[4669]: I1001 11:45:12.949544 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-plhdj-config-4m4gk"] Oct 01 11:45:13 crc kubenswrapper[4669]: I1001 11:45:13.033315 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwslg\" (UniqueName: \"kubernetes.io/projected/48969021-b816-4ae7-a52c-f26845df0580-kube-api-access-vwslg\") pod \"48969021-b816-4ae7-a52c-f26845df0580\" (UID: \"48969021-b816-4ae7-a52c-f26845df0580\") " Oct 01 11:45:13 crc kubenswrapper[4669]: I1001 11:45:13.038227 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48969021-b816-4ae7-a52c-f26845df0580-kube-api-access-vwslg" (OuterVolumeSpecName: "kube-api-access-vwslg") pod "48969021-b816-4ae7-a52c-f26845df0580" (UID: "48969021-b816-4ae7-a52c-f26845df0580"). InnerVolumeSpecName "kube-api-access-vwslg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:13 crc kubenswrapper[4669]: I1001 11:45:13.135255 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwslg\" (UniqueName: \"kubernetes.io/projected/48969021-b816-4ae7-a52c-f26845df0580-kube-api-access-vwslg\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:13 crc kubenswrapper[4669]: I1001 11:45:13.531125 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5srjm" Oct 01 11:45:13 crc kubenswrapper[4669]: I1001 11:45:13.531185 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5srjm" event={"ID":"48969021-b816-4ae7-a52c-f26845df0580","Type":"ContainerDied","Data":"e1f25482d1d3d2dce5453ab8beb671a8894f6775d99adeea5f53f00e47cd3af9"} Oct 01 11:45:13 crc kubenswrapper[4669]: I1001 11:45:13.531223 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1f25482d1d3d2dce5453ab8beb671a8894f6775d99adeea5f53f00e47cd3af9" Oct 01 11:45:13 crc kubenswrapper[4669]: I1001 11:45:13.689411 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60085ab1-2ed6-4050-bb56-a658aff45389" path="/var/lib/kubelet/pods/60085ab1-2ed6-4050-bb56-a658aff45389/volumes" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.059358 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.137393 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb7fm"] Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.137700 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-xb7fm" podUID="60fd3501-3d20-401c-b46c-ebc2451bf0ce" containerName="dnsmasq-dns" containerID="cri-o://9c022035fbe45c8d1dad8c7e48ba39d1b529f02b6d8a3cb1b94587890a72729f" gracePeriod=10 Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.222065 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jpbz8" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.234578 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-4stcf" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.273332 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-xb7fm" podUID="60fd3501-3d20-401c-b46c-ebc2451bf0ce" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.418490 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx7jt\" (UniqueName: \"kubernetes.io/projected/60db4b61-e005-45c2-a41c-9ba9e7709a90-kube-api-access-dx7jt\") pod \"60db4b61-e005-45c2-a41c-9ba9e7709a90\" (UID: \"60db4b61-e005-45c2-a41c-9ba9e7709a90\") " Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.418575 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4t7g\" (UniqueName: \"kubernetes.io/projected/b7377422-847a-48cf-9248-7126e2fda461-kube-api-access-w4t7g\") pod \"b7377422-847a-48cf-9248-7126e2fda461\" (UID: \"b7377422-847a-48cf-9248-7126e2fda461\") " Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.425866 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60db4b61-e005-45c2-a41c-9ba9e7709a90-kube-api-access-dx7jt" (OuterVolumeSpecName: "kube-api-access-dx7jt") pod "60db4b61-e005-45c2-a41c-9ba9e7709a90" (UID: "60db4b61-e005-45c2-a41c-9ba9e7709a90"). InnerVolumeSpecName "kube-api-access-dx7jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.434294 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7377422-847a-48cf-9248-7126e2fda461-kube-api-access-w4t7g" (OuterVolumeSpecName: "kube-api-access-w4t7g") pod "b7377422-847a-48cf-9248-7126e2fda461" (UID: "b7377422-847a-48cf-9248-7126e2fda461"). InnerVolumeSpecName "kube-api-access-w4t7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.522018 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx7jt\" (UniqueName: \"kubernetes.io/projected/60db4b61-e005-45c2-a41c-9ba9e7709a90-kube-api-access-dx7jt\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.522155 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4t7g\" (UniqueName: \"kubernetes.io/projected/b7377422-847a-48cf-9248-7126e2fda461-kube-api-access-w4t7g\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.553692 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4stcf" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.553709 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4stcf" event={"ID":"b7377422-847a-48cf-9248-7126e2fda461","Type":"ContainerDied","Data":"b796f7a4bf5862d7d5457900b1a38590d11cb87efdf452dad2652331a6e1f64e"} Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.553760 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b796f7a4bf5862d7d5457900b1a38590d11cb87efdf452dad2652331a6e1f64e" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.557341 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-jpbz8" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.557322 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jpbz8" event={"ID":"60db4b61-e005-45c2-a41c-9ba9e7709a90","Type":"ContainerDied","Data":"45837b554a88e7ec94edffdae804fb5f80a623991cc70330d37f9ed197919a04"} Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.557437 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45837b554a88e7ec94edffdae804fb5f80a623991cc70330d37f9ed197919a04" Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.560031 4669 generic.go:334] "Generic (PLEG): container finished" podID="60fd3501-3d20-401c-b46c-ebc2451bf0ce" containerID="9c022035fbe45c8d1dad8c7e48ba39d1b529f02b6d8a3cb1b94587890a72729f" exitCode=0 Oct 01 11:45:15 crc kubenswrapper[4669]: I1001 11:45:15.560109 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xb7fm" event={"ID":"60fd3501-3d20-401c-b46c-ebc2451bf0ce","Type":"ContainerDied","Data":"9c022035fbe45c8d1dad8c7e48ba39d1b529f02b6d8a3cb1b94587890a72729f"} Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.598770 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.605186 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xb7fm" event={"ID":"60fd3501-3d20-401c-b46c-ebc2451bf0ce","Type":"ContainerDied","Data":"46d71328fb87b3940d665ae81dcec382b801e1421907fb626e54bf9c4f808b32"} Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.605283 4669 scope.go:117] "RemoveContainer" containerID="9c022035fbe45c8d1dad8c7e48ba39d1b529f02b6d8a3cb1b94587890a72729f" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.605215 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xb7fm" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.615640 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlkrt\" (UniqueName: \"kubernetes.io/projected/60fd3501-3d20-401c-b46c-ebc2451bf0ce-kube-api-access-nlkrt\") pod \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.615684 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-config\") pod \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.615745 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-dns-svc\") pod \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.615779 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-nb\") pod \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.615810 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-sb\") pod \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\" (UID: \"60fd3501-3d20-401c-b46c-ebc2451bf0ce\") " Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.634771 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/60fd3501-3d20-401c-b46c-ebc2451bf0ce-kube-api-access-nlkrt" (OuterVolumeSpecName: "kube-api-access-nlkrt") pod "60fd3501-3d20-401c-b46c-ebc2451bf0ce" (UID: "60fd3501-3d20-401c-b46c-ebc2451bf0ce"). InnerVolumeSpecName "kube-api-access-nlkrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.689133 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "60fd3501-3d20-401c-b46c-ebc2451bf0ce" (UID: "60fd3501-3d20-401c-b46c-ebc2451bf0ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.690899 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "60fd3501-3d20-401c-b46c-ebc2451bf0ce" (UID: "60fd3501-3d20-401c-b46c-ebc2451bf0ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.695528 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60fd3501-3d20-401c-b46c-ebc2451bf0ce" (UID: "60fd3501-3d20-401c-b46c-ebc2451bf0ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.738305 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-config" (OuterVolumeSpecName: "config") pod "60fd3501-3d20-401c-b46c-ebc2451bf0ce" (UID: "60fd3501-3d20-401c-b46c-ebc2451bf0ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.740336 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlkrt\" (UniqueName: \"kubernetes.io/projected/60fd3501-3d20-401c-b46c-ebc2451bf0ce-kube-api-access-nlkrt\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.740372 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.740390 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.740404 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.740416 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60fd3501-3d20-401c-b46c-ebc2451bf0ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.956347 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb7fm"] Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.963384 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb7fm"] Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.996246 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8dae-account-create-vx5r9"] Oct 01 11:45:19 crc kubenswrapper[4669]: E1001 11:45:19.996739 4669 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b7377422-847a-48cf-9248-7126e2fda461" containerName="mariadb-database-create" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.996761 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7377422-847a-48cf-9248-7126e2fda461" containerName="mariadb-database-create" Oct 01 11:45:19 crc kubenswrapper[4669]: E1001 11:45:19.996779 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60fd3501-3d20-401c-b46c-ebc2451bf0ce" containerName="init" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.996787 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="60fd3501-3d20-401c-b46c-ebc2451bf0ce" containerName="init" Oct 01 11:45:19 crc kubenswrapper[4669]: E1001 11:45:19.996804 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60db4b61-e005-45c2-a41c-9ba9e7709a90" containerName="mariadb-database-create" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.996814 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="60db4b61-e005-45c2-a41c-9ba9e7709a90" containerName="mariadb-database-create" Oct 01 11:45:19 crc kubenswrapper[4669]: E1001 11:45:19.996825 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60fd3501-3d20-401c-b46c-ebc2451bf0ce" containerName="dnsmasq-dns" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.996830 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="60fd3501-3d20-401c-b46c-ebc2451bf0ce" containerName="dnsmasq-dns" Oct 01 11:45:19 crc kubenswrapper[4669]: E1001 11:45:19.996861 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60085ab1-2ed6-4050-bb56-a658aff45389" containerName="ovn-config" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.996879 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="60085ab1-2ed6-4050-bb56-a658aff45389" containerName="ovn-config" Oct 01 11:45:19 crc kubenswrapper[4669]: E1001 11:45:19.996889 4669 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="48969021-b816-4ae7-a52c-f26845df0580" containerName="mariadb-database-create" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.996897 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="48969021-b816-4ae7-a52c-f26845df0580" containerName="mariadb-database-create" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.997052 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="60fd3501-3d20-401c-b46c-ebc2451bf0ce" containerName="dnsmasq-dns" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.997068 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="60db4b61-e005-45c2-a41c-9ba9e7709a90" containerName="mariadb-database-create" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.997097 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7377422-847a-48cf-9248-7126e2fda461" containerName="mariadb-database-create" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.997107 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="48969021-b816-4ae7-a52c-f26845df0580" containerName="mariadb-database-create" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.997115 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="60085ab1-2ed6-4050-bb56-a658aff45389" containerName="ovn-config" Oct 01 11:45:19 crc kubenswrapper[4669]: I1001 11:45:19.997744 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8dae-account-create-vx5r9" Oct 01 11:45:20 crc kubenswrapper[4669]: I1001 11:45:20.000263 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 01 11:45:20 crc kubenswrapper[4669]: I1001 11:45:20.003969 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8dae-account-create-vx5r9"] Oct 01 11:45:20 crc kubenswrapper[4669]: I1001 11:45:20.156248 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbcrn\" (UniqueName: \"kubernetes.io/projected/9b4e7624-e1b8-47b7-a7de-5566e2180147-kube-api-access-vbcrn\") pod \"barbican-8dae-account-create-vx5r9\" (UID: \"9b4e7624-e1b8-47b7-a7de-5566e2180147\") " pod="openstack/barbican-8dae-account-create-vx5r9" Oct 01 11:45:20 crc kubenswrapper[4669]: I1001 11:45:20.257917 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbcrn\" (UniqueName: \"kubernetes.io/projected/9b4e7624-e1b8-47b7-a7de-5566e2180147-kube-api-access-vbcrn\") pod \"barbican-8dae-account-create-vx5r9\" (UID: \"9b4e7624-e1b8-47b7-a7de-5566e2180147\") " pod="openstack/barbican-8dae-account-create-vx5r9" Oct 01 11:45:20 crc kubenswrapper[4669]: I1001 11:45:20.281029 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbcrn\" (UniqueName: \"kubernetes.io/projected/9b4e7624-e1b8-47b7-a7de-5566e2180147-kube-api-access-vbcrn\") pod \"barbican-8dae-account-create-vx5r9\" (UID: \"9b4e7624-e1b8-47b7-a7de-5566e2180147\") " pod="openstack/barbican-8dae-account-create-vx5r9" Oct 01 11:45:20 crc kubenswrapper[4669]: I1001 11:45:20.329920 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8dae-account-create-vx5r9" Oct 01 11:45:21 crc kubenswrapper[4669]: I1001 11:45:21.662231 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60fd3501-3d20-401c-b46c-ebc2451bf0ce" path="/var/lib/kubelet/pods/60fd3501-3d20-401c-b46c-ebc2451bf0ce/volumes" Oct 01 11:45:27 crc kubenswrapper[4669]: E1001 11:45:27.442194 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Oct 01 11:45:27 crc kubenswrapper[4669]: E1001 11:45:27.443276 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvggx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil
,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-s89kf_openstack(6c85d289-ff7f-4b57-a54a-cb272dec58e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 11:45:27 crc kubenswrapper[4669]: E1001 11:45:27.444857 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-s89kf" podUID="6c85d289-ff7f-4b57-a54a-cb272dec58e2" Oct 01 11:45:27 crc kubenswrapper[4669]: I1001 11:45:27.461302 4669 scope.go:117] "RemoveContainer" containerID="67676c9744e09cef9d723fc7f389d796be3188532f95e17ddf62d1a8339c4690" Oct 01 11:45:27 crc kubenswrapper[4669]: E1001 11:45:27.720916 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-s89kf" podUID="6c85d289-ff7f-4b57-a54a-cb272dec58e2" Oct 01 11:45:27 crc kubenswrapper[4669]: I1001 11:45:27.949874 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-8dae-account-create-vx5r9"] Oct 01 11:45:27 crc kubenswrapper[4669]: W1001 11:45:27.962525 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b4e7624_e1b8_47b7_a7de_5566e2180147.slice/crio-f6fdbe2751daa37c7574424e5effa83a4945ea0815746fcb9929f537a3fdb9c3 WatchSource:0}: Error finding container f6fdbe2751daa37c7574424e5effa83a4945ea0815746fcb9929f537a3fdb9c3: Status 404 returned error can't find the container with id f6fdbe2751daa37c7574424e5effa83a4945ea0815746fcb9929f537a3fdb9c3 Oct 01 11:45:28 crc kubenswrapper[4669]: I1001 11:45:28.743544 4669 generic.go:334] "Generic (PLEG): container finished" podID="9b4e7624-e1b8-47b7-a7de-5566e2180147" containerID="03a0061ec24c0c5ec4f3119fa161aec560d6d14ccdf5805e83305e638168f575" exitCode=0 Oct 01 11:45:28 crc kubenswrapper[4669]: I1001 11:45:28.744127 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8dae-account-create-vx5r9" event={"ID":"9b4e7624-e1b8-47b7-a7de-5566e2180147","Type":"ContainerDied","Data":"03a0061ec24c0c5ec4f3119fa161aec560d6d14ccdf5805e83305e638168f575"} Oct 01 11:45:28 crc kubenswrapper[4669]: I1001 11:45:28.744179 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8dae-account-create-vx5r9" event={"ID":"9b4e7624-e1b8-47b7-a7de-5566e2180147","Type":"ContainerStarted","Data":"f6fdbe2751daa37c7574424e5effa83a4945ea0815746fcb9929f537a3fdb9c3"} Oct 01 11:45:28 crc kubenswrapper[4669]: I1001 11:45:28.750356 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ttgc2" event={"ID":"22e5eb36-78d1-4d8c-85ec-330fae011103","Type":"ContainerStarted","Data":"c24bbdba316afc8a8949f202d2513184086ae3dcafd406e7fbfc58eeb61fd282"} Oct 01 11:45:28 crc kubenswrapper[4669]: I1001 11:45:28.794675 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ttgc2" 
podStartSLOduration=2.461407846 podStartE2EDuration="18.794633164s" podCreationTimestamp="2025-10-01 11:45:10 +0000 UTC" firstStartedPulling="2025-10-01 11:45:11.144152407 +0000 UTC m=+1002.243717374" lastFinishedPulling="2025-10-01 11:45:27.477377695 +0000 UTC m=+1018.576942692" observedRunningTime="2025-10-01 11:45:28.791245562 +0000 UTC m=+1019.890810619" watchObservedRunningTime="2025-10-01 11:45:28.794633164 +0000 UTC m=+1019.894198181" Oct 01 11:45:29 crc kubenswrapper[4669]: I1001 11:45:29.949671 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a0d9-account-create-m6bsh"] Oct 01 11:45:29 crc kubenswrapper[4669]: I1001 11:45:29.953668 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a0d9-account-create-m6bsh" Oct 01 11:45:29 crc kubenswrapper[4669]: I1001 11:45:29.959161 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 01 11:45:29 crc kubenswrapper[4669]: I1001 11:45:29.963183 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a0d9-account-create-m6bsh"] Oct 01 11:45:29 crc kubenswrapper[4669]: I1001 11:45:29.985229 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x5lh\" (UniqueName: \"kubernetes.io/projected/8b3a2882-862b-4f3d-91e2-0f18d1960a91-kube-api-access-8x5lh\") pod \"cinder-a0d9-account-create-m6bsh\" (UID: \"8b3a2882-862b-4f3d-91e2-0f18d1960a91\") " pod="openstack/cinder-a0d9-account-create-m6bsh" Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.087315 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x5lh\" (UniqueName: \"kubernetes.io/projected/8b3a2882-862b-4f3d-91e2-0f18d1960a91-kube-api-access-8x5lh\") pod \"cinder-a0d9-account-create-m6bsh\" (UID: \"8b3a2882-862b-4f3d-91e2-0f18d1960a91\") " pod="openstack/cinder-a0d9-account-create-m6bsh" Oct 01 11:45:30 crc 
kubenswrapper[4669]: I1001 11:45:30.109402 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x5lh\" (UniqueName: \"kubernetes.io/projected/8b3a2882-862b-4f3d-91e2-0f18d1960a91-kube-api-access-8x5lh\") pod \"cinder-a0d9-account-create-m6bsh\" (UID: \"8b3a2882-862b-4f3d-91e2-0f18d1960a91\") " pod="openstack/cinder-a0d9-account-create-m6bsh" Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.159001 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8dae-account-create-vx5r9" Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.282894 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5197-account-create-5fftd"] Oct 01 11:45:30 crc kubenswrapper[4669]: E1001 11:45:30.283470 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4e7624-e1b8-47b7-a7de-5566e2180147" containerName="mariadb-account-create" Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.283492 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4e7624-e1b8-47b7-a7de-5566e2180147" containerName="mariadb-account-create" Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.283741 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4e7624-e1b8-47b7-a7de-5566e2180147" containerName="mariadb-account-create" Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.284761 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5197-account-create-5fftd" Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.287155 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.290462 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5197-account-create-5fftd"] Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.291059 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbcrn\" (UniqueName: \"kubernetes.io/projected/9b4e7624-e1b8-47b7-a7de-5566e2180147-kube-api-access-vbcrn\") pod \"9b4e7624-e1b8-47b7-a7de-5566e2180147\" (UID: \"9b4e7624-e1b8-47b7-a7de-5566e2180147\") " Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.292061 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqjwl\" (UniqueName: \"kubernetes.io/projected/ca2aebeb-0425-4f89-b3a9-be541ae1e07c-kube-api-access-zqjwl\") pod \"neutron-5197-account-create-5fftd\" (UID: \"ca2aebeb-0425-4f89-b3a9-be541ae1e07c\") " pod="openstack/neutron-5197-account-create-5fftd" Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.293329 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a0d9-account-create-m6bsh" Oct 01 11:45:30 crc kubenswrapper[4669]: I1001 11:45:30.295259 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4e7624-e1b8-47b7-a7de-5566e2180147-kube-api-access-vbcrn" (OuterVolumeSpecName: "kube-api-access-vbcrn") pod "9b4e7624-e1b8-47b7-a7de-5566e2180147" (UID: "9b4e7624-e1b8-47b7-a7de-5566e2180147"). InnerVolumeSpecName "kube-api-access-vbcrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:30.397100 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqjwl\" (UniqueName: \"kubernetes.io/projected/ca2aebeb-0425-4f89-b3a9-be541ae1e07c-kube-api-access-zqjwl\") pod \"neutron-5197-account-create-5fftd\" (UID: \"ca2aebeb-0425-4f89-b3a9-be541ae1e07c\") " pod="openstack/neutron-5197-account-create-5fftd" Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:30.397202 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbcrn\" (UniqueName: \"kubernetes.io/projected/9b4e7624-e1b8-47b7-a7de-5566e2180147-kube-api-access-vbcrn\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:30.444331 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqjwl\" (UniqueName: \"kubernetes.io/projected/ca2aebeb-0425-4f89-b3a9-be541ae1e07c-kube-api-access-zqjwl\") pod \"neutron-5197-account-create-5fftd\" (UID: \"ca2aebeb-0425-4f89-b3a9-be541ae1e07c\") " pod="openstack/neutron-5197-account-create-5fftd" Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:30.732202 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5197-account-create-5fftd" Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:30.787702 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8dae-account-create-vx5r9" event={"ID":"9b4e7624-e1b8-47b7-a7de-5566e2180147","Type":"ContainerDied","Data":"f6fdbe2751daa37c7574424e5effa83a4945ea0815746fcb9929f537a3fdb9c3"} Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:30.788322 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6fdbe2751daa37c7574424e5effa83a4945ea0815746fcb9929f537a3fdb9c3" Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:30.787825 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8dae-account-create-vx5r9" Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:31.780604 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5197-account-create-5fftd"] Oct 01 11:45:31 crc kubenswrapper[4669]: W1001 11:45:31.787059 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca2aebeb_0425_4f89_b3a9_be541ae1e07c.slice/crio-c818f0b93326021c5280d7c0a85e47db7cd190cc0a2a1a6e49cdcae8d3fa38c1 WatchSource:0}: Error finding container c818f0b93326021c5280d7c0a85e47db7cd190cc0a2a1a6e49cdcae8d3fa38c1: Status 404 returned error can't find the container with id c818f0b93326021c5280d7c0a85e47db7cd190cc0a2a1a6e49cdcae8d3fa38c1 Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:31.797610 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a0d9-account-create-m6bsh"] Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:31.806292 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5197-account-create-5fftd" 
event={"ID":"ca2aebeb-0425-4f89-b3a9-be541ae1e07c","Type":"ContainerStarted","Data":"c818f0b93326021c5280d7c0a85e47db7cd190cc0a2a1a6e49cdcae8d3fa38c1"} Oct 01 11:45:31 crc kubenswrapper[4669]: W1001 11:45:31.821204 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b3a2882_862b_4f3d_91e2_0f18d1960a91.slice/crio-9907903f91f44c0500eb0bae94cd19c022d46ab61d9d7491ae1043f40ad33a28 WatchSource:0}: Error finding container 9907903f91f44c0500eb0bae94cd19c022d46ab61d9d7491ae1043f40ad33a28: Status 404 returned error can't find the container with id 9907903f91f44c0500eb0bae94cd19c022d46ab61d9d7491ae1043f40ad33a28 Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:31.863522 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:45:31 crc kubenswrapper[4669]: I1001 11:45:31.863606 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:45:32 crc kubenswrapper[4669]: I1001 11:45:32.818880 4669 generic.go:334] "Generic (PLEG): container finished" podID="ca2aebeb-0425-4f89-b3a9-be541ae1e07c" containerID="86aa968b793f6e17beb2a4767540d3201a44fefdd39639ca30686796c0f121a5" exitCode=0 Oct 01 11:45:32 crc kubenswrapper[4669]: I1001 11:45:32.818950 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5197-account-create-5fftd" 
event={"ID":"ca2aebeb-0425-4f89-b3a9-be541ae1e07c","Type":"ContainerDied","Data":"86aa968b793f6e17beb2a4767540d3201a44fefdd39639ca30686796c0f121a5"} Oct 01 11:45:32 crc kubenswrapper[4669]: I1001 11:45:32.822796 4669 generic.go:334] "Generic (PLEG): container finished" podID="22e5eb36-78d1-4d8c-85ec-330fae011103" containerID="c24bbdba316afc8a8949f202d2513184086ae3dcafd406e7fbfc58eeb61fd282" exitCode=0 Oct 01 11:45:32 crc kubenswrapper[4669]: I1001 11:45:32.822896 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ttgc2" event={"ID":"22e5eb36-78d1-4d8c-85ec-330fae011103","Type":"ContainerDied","Data":"c24bbdba316afc8a8949f202d2513184086ae3dcafd406e7fbfc58eeb61fd282"} Oct 01 11:45:32 crc kubenswrapper[4669]: I1001 11:45:32.825394 4669 generic.go:334] "Generic (PLEG): container finished" podID="8b3a2882-862b-4f3d-91e2-0f18d1960a91" containerID="efd7eb073d20d63343033600c20646c1007af88a45c40f636ac0309656383f1f" exitCode=0 Oct 01 11:45:32 crc kubenswrapper[4669]: I1001 11:45:32.825442 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a0d9-account-create-m6bsh" event={"ID":"8b3a2882-862b-4f3d-91e2-0f18d1960a91","Type":"ContainerDied","Data":"efd7eb073d20d63343033600c20646c1007af88a45c40f636ac0309656383f1f"} Oct 01 11:45:32 crc kubenswrapper[4669]: I1001 11:45:32.825475 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a0d9-account-create-m6bsh" event={"ID":"8b3a2882-862b-4f3d-91e2-0f18d1960a91","Type":"ContainerStarted","Data":"9907903f91f44c0500eb0bae94cd19c022d46ab61d9d7491ae1043f40ad33a28"} Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.286008 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a0d9-account-create-m6bsh" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.390173 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x5lh\" (UniqueName: \"kubernetes.io/projected/8b3a2882-862b-4f3d-91e2-0f18d1960a91-kube-api-access-8x5lh\") pod \"8b3a2882-862b-4f3d-91e2-0f18d1960a91\" (UID: \"8b3a2882-862b-4f3d-91e2-0f18d1960a91\") " Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.396785 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5197-account-create-5fftd" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.397479 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3a2882-862b-4f3d-91e2-0f18d1960a91-kube-api-access-8x5lh" (OuterVolumeSpecName: "kube-api-access-8x5lh") pod "8b3a2882-862b-4f3d-91e2-0f18d1960a91" (UID: "8b3a2882-862b-4f3d-91e2-0f18d1960a91"). InnerVolumeSpecName "kube-api-access-8x5lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.450336 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.492934 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x5lh\" (UniqueName: \"kubernetes.io/projected/8b3a2882-862b-4f3d-91e2-0f18d1960a91-kube-api-access-8x5lh\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.595073 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqjwl\" (UniqueName: \"kubernetes.io/projected/ca2aebeb-0425-4f89-b3a9-be541ae1e07c-kube-api-access-zqjwl\") pod \"ca2aebeb-0425-4f89-b3a9-be541ae1e07c\" (UID: \"ca2aebeb-0425-4f89-b3a9-be541ae1e07c\") " Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.595412 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-combined-ca-bundle\") pod \"22e5eb36-78d1-4d8c-85ec-330fae011103\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.595496 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p887\" (UniqueName: \"kubernetes.io/projected/22e5eb36-78d1-4d8c-85ec-330fae011103-kube-api-access-8p887\") pod \"22e5eb36-78d1-4d8c-85ec-330fae011103\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.595632 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-config-data\") pod \"22e5eb36-78d1-4d8c-85ec-330fae011103\" (UID: \"22e5eb36-78d1-4d8c-85ec-330fae011103\") " Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.599020 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/22e5eb36-78d1-4d8c-85ec-330fae011103-kube-api-access-8p887" (OuterVolumeSpecName: "kube-api-access-8p887") pod "22e5eb36-78d1-4d8c-85ec-330fae011103" (UID: "22e5eb36-78d1-4d8c-85ec-330fae011103"). InnerVolumeSpecName "kube-api-access-8p887". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.601531 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2aebeb-0425-4f89-b3a9-be541ae1e07c-kube-api-access-zqjwl" (OuterVolumeSpecName: "kube-api-access-zqjwl") pod "ca2aebeb-0425-4f89-b3a9-be541ae1e07c" (UID: "ca2aebeb-0425-4f89-b3a9-be541ae1e07c"). InnerVolumeSpecName "kube-api-access-zqjwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.627029 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22e5eb36-78d1-4d8c-85ec-330fae011103" (UID: "22e5eb36-78d1-4d8c-85ec-330fae011103"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.662848 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-config-data" (OuterVolumeSpecName: "config-data") pod "22e5eb36-78d1-4d8c-85ec-330fae011103" (UID: "22e5eb36-78d1-4d8c-85ec-330fae011103"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.698764 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.698818 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p887\" (UniqueName: \"kubernetes.io/projected/22e5eb36-78d1-4d8c-85ec-330fae011103-kube-api-access-8p887\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.698844 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22e5eb36-78d1-4d8c-85ec-330fae011103-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.698862 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqjwl\" (UniqueName: \"kubernetes.io/projected/ca2aebeb-0425-4f89-b3a9-be541ae1e07c-kube-api-access-zqjwl\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.851972 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5197-account-create-5fftd" event={"ID":"ca2aebeb-0425-4f89-b3a9-be541ae1e07c","Type":"ContainerDied","Data":"c818f0b93326021c5280d7c0a85e47db7cd190cc0a2a1a6e49cdcae8d3fa38c1"} Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.852030 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c818f0b93326021c5280d7c0a85e47db7cd190cc0a2a1a6e49cdcae8d3fa38c1" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.852042 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5197-account-create-5fftd" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.854760 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ttgc2" event={"ID":"22e5eb36-78d1-4d8c-85ec-330fae011103","Type":"ContainerDied","Data":"b8740600e74a1fff814b188383acdc525abfe6b1a56b4076ca17a647ce2df24e"} Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.854789 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8740600e74a1fff814b188383acdc525abfe6b1a56b4076ca17a647ce2df24e" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.854802 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ttgc2" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.856900 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a0d9-account-create-m6bsh" event={"ID":"8b3a2882-862b-4f3d-91e2-0f18d1960a91","Type":"ContainerDied","Data":"9907903f91f44c0500eb0bae94cd19c022d46ab61d9d7491ae1043f40ad33a28"} Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.856960 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9907903f91f44c0500eb0bae94cd19c022d46ab61d9d7491ae1043f40ad33a28" Oct 01 11:45:34 crc kubenswrapper[4669]: I1001 11:45:34.856975 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a0d9-account-create-m6bsh" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.180067 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-j54rq"] Oct 01 11:45:35 crc kubenswrapper[4669]: E1001 11:45:35.180983 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2aebeb-0425-4f89-b3a9-be541ae1e07c" containerName="mariadb-account-create" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.181010 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2aebeb-0425-4f89-b3a9-be541ae1e07c" containerName="mariadb-account-create" Oct 01 11:45:35 crc kubenswrapper[4669]: E1001 11:45:35.181028 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3a2882-862b-4f3d-91e2-0f18d1960a91" containerName="mariadb-account-create" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.181035 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3a2882-862b-4f3d-91e2-0f18d1960a91" containerName="mariadb-account-create" Oct 01 11:45:35 crc kubenswrapper[4669]: E1001 11:45:35.181105 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22e5eb36-78d1-4d8c-85ec-330fae011103" containerName="keystone-db-sync" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.181117 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e5eb36-78d1-4d8c-85ec-330fae011103" containerName="keystone-db-sync" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.181331 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3a2882-862b-4f3d-91e2-0f18d1960a91" containerName="mariadb-account-create" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.181362 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="22e5eb36-78d1-4d8c-85ec-330fae011103" containerName="keystone-db-sync" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.181381 4669 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ca2aebeb-0425-4f89-b3a9-be541ae1e07c" containerName="mariadb-account-create" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.182677 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.207457 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zgw2m"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.208795 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.212755 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.213008 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.213158 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fz8wd" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.213283 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.227336 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-j54rq"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.253476 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zgw2m"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.310756 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 
01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.310815 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.310837 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-combined-ca-bundle\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.310862 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.310882 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk4d7\" (UniqueName: \"kubernetes.io/projected/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-kube-api-access-sk4d7\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.310920 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-config\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " 
pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.310986 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-config-data\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.311018 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-svc\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.311050 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-credential-keys\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.311106 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srwn8\" (UniqueName: \"kubernetes.io/projected/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-kube-api-access-srwn8\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.311185 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-scripts\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " 
pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.311211 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-fernet-keys\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.387822 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7697d5fb49-4zsxr"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.392235 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.395187 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.395705 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.396595 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-l4wqh" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.403788 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.414733 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-scripts\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.414783 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-fernet-keys\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.414816 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.414845 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/007c4768-f1c7-4750-a403-9a930798b8fb-horizon-secret-key\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.414870 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nbzf\" (UniqueName: \"kubernetes.io/projected/007c4768-f1c7-4750-a403-9a930798b8fb-kube-api-access-8nbzf\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.414891 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.414912 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-combined-ca-bundle\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.414929 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.414954 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk4d7\" (UniqueName: \"kubernetes.io/projected/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-kube-api-access-sk4d7\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.414994 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-config\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.415141 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-config-data\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.415210 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-config-data\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.415259 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-svc\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.415300 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-credential-keys\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.415326 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007c4768-f1c7-4750-a403-9a930798b8fb-logs\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.415363 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srwn8\" (UniqueName: \"kubernetes.io/projected/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-kube-api-access-srwn8\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.415456 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-scripts\") pod 
\"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.416319 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.417129 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-config\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.417813 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.417913 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-svc\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.418717 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " 
pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.424300 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7697d5fb49-4zsxr"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.433160 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-fernet-keys\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.435827 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-combined-ca-bundle\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.436133 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-config-data\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.444765 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-scripts\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.449711 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-credential-keys\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " 
pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.449854 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srwn8\" (UniqueName: \"kubernetes.io/projected/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-kube-api-access-srwn8\") pod \"keystone-bootstrap-zgw2m\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.452692 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk4d7\" (UniqueName: \"kubernetes.io/projected/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-kube-api-access-sk4d7\") pod \"dnsmasq-dns-55fff446b9-j54rq\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.495155 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.504764 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.513581 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.513847 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.516161 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.517676 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/007c4768-f1c7-4750-a403-9a930798b8fb-horizon-secret-key\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.517722 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nbzf\" (UniqueName: \"kubernetes.io/projected/007c4768-f1c7-4750-a403-9a930798b8fb-kube-api-access-8nbzf\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.517854 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-config-data\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.517917 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007c4768-f1c7-4750-a403-9a930798b8fb-logs\") pod \"horizon-7697d5fb49-4zsxr\" (UID: 
\"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.518025 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-scripts\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.518901 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-scripts\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.519190 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007c4768-f1c7-4750-a403-9a930798b8fb-logs\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.519334 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-config-data\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.535141 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/007c4768-f1c7-4750-a403-9a930798b8fb-horizon-secret-key\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.559630 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nbzf\" (UniqueName: \"kubernetes.io/projected/007c4768-f1c7-4750-a403-9a930798b8fb-kube-api-access-8nbzf\") pod \"horizon-7697d5fb49-4zsxr\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.587455 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.594701 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.601458 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-2cbqn"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.603109 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.605754 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xj549" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.606028 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.634904 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2cbqn"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.637482 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-log-httpd\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.637616 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-scripts\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.637698 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.637789 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.637859 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-run-httpd\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.637938 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txz9v\" (UniqueName: \"kubernetes.io/projected/4814501d-3b55-40bb-b932-41f91ca1d7fb-kube-api-access-txz9v\") pod \"barbican-db-sync-2cbqn\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.638024 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-config-data\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.638118 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-db-sync-config-data\") pod \"barbican-db-sync-2cbqn\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.638208 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-combined-ca-bundle\") pod \"barbican-db-sync-2cbqn\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.638595 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjmhm\" (UniqueName: \"kubernetes.io/projected/e1ba96b9-d556-419e-a8a3-f90348499977-kube-api-access-jjmhm\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.690663 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8r7vt"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.692275 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f47dd5fdf-8bs76"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.694182 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.694707 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.697506 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.699917 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.700587 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mpsgs" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.701205 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8r7vt"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.718001 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f47dd5fdf-8bs76"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.720463 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.803877 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.804189 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-config-data\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.804673 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.804730 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-run-httpd\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.805211 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-run-httpd\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808067 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m8cz\" (UniqueName: \"kubernetes.io/projected/56bc6065-f53f-4531-b18b-d7cab77a717b-kube-api-access-5m8cz\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808197 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee3395f1-0549-4bc4-a145-42ff20c37da6-logs\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808239 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-scripts\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808269 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56bc6065-f53f-4531-b18b-d7cab77a717b-horizon-secret-key\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808307 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txz9v\" (UniqueName: \"kubernetes.io/projected/4814501d-3b55-40bb-b932-41f91ca1d7fb-kube-api-access-txz9v\") pod \"barbican-db-sync-2cbqn\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808346 4669 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-config-data\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808434 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-combined-ca-bundle\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808496 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56bc6065-f53f-4531-b18b-d7cab77a717b-logs\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808541 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-db-sync-config-data\") pod \"barbican-db-sync-2cbqn\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808621 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-combined-ca-bundle\") pod \"barbican-db-sync-2cbqn\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808838 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjmhm\" (UniqueName: 
\"kubernetes.io/projected/e1ba96b9-d556-419e-a8a3-f90348499977-kube-api-access-jjmhm\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808904 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cflqb\" (UniqueName: \"kubernetes.io/projected/ee3395f1-0549-4bc4-a145-42ff20c37da6-kube-api-access-cflqb\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808932 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-scripts\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.808994 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-log-httpd\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.809039 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-config-data\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.809150 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-scripts\") pod \"ceilometer-0\" (UID: 
\"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.817483 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-j54rq"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.820197 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-log-httpd\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.826957 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-scripts\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.827598 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.839515 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.840229 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-config-data\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: 
I1001 11:45:35.840581 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-db-sync-config-data\") pod \"barbican-db-sync-2cbqn\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.839527 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-combined-ca-bundle\") pod \"barbican-db-sync-2cbqn\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.842837 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txz9v\" (UniqueName: \"kubernetes.io/projected/4814501d-3b55-40bb-b932-41f91ca1d7fb-kube-api-access-txz9v\") pod \"barbican-db-sync-2cbqn\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.846624 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjmhm\" (UniqueName: \"kubernetes.io/projected/e1ba96b9-d556-419e-a8a3-f90348499977-kube-api-access-jjmhm\") pod \"ceilometer-0\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") " pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.851811 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mjq5j"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.854014 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.879176 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mjq5j"] Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.906352 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.910862 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cflqb\" (UniqueName: \"kubernetes.io/projected/ee3395f1-0549-4bc4-a145-42ff20c37da6-kube-api-access-cflqb\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.910907 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kp2s\" (UniqueName: \"kubernetes.io/projected/1f2a482e-2660-474a-9e8f-28c2d9b75648-kube-api-access-7kp2s\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.910936 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-scripts\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.910963 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-config-data\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 
11:45:35.910985 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-config\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.911025 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-config-data\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.911049 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.911086 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m8cz\" (UniqueName: \"kubernetes.io/projected/56bc6065-f53f-4531-b18b-d7cab77a717b-kube-api-access-5m8cz\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.911108 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.911128 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee3395f1-0549-4bc4-a145-42ff20c37da6-logs\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.911147 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-scripts\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.911167 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56bc6065-f53f-4531-b18b-d7cab77a717b-horizon-secret-key\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.911192 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.911218 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-combined-ca-bundle\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.911248 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/56bc6065-f53f-4531-b18b-d7cab77a717b-logs\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.911272 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.912216 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee3395f1-0549-4bc4-a145-42ff20c37da6-logs\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.913010 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-scripts\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.913523 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56bc6065-f53f-4531-b18b-d7cab77a717b-logs\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.918781 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56bc6065-f53f-4531-b18b-d7cab77a717b-horizon-secret-key\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " 
pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.920488 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-combined-ca-bundle\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.921366 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-config-data\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.923419 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-scripts\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.934276 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cflqb\" (UniqueName: \"kubernetes.io/projected/ee3395f1-0549-4bc4-a145-42ff20c37da6-kube-api-access-cflqb\") pod \"placement-db-sync-8r7vt\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.937005 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-config-data\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.937549 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5m8cz\" (UniqueName: \"kubernetes.io/projected/56bc6065-f53f-4531-b18b-d7cab77a717b-kube-api-access-5m8cz\") pod \"horizon-f47dd5fdf-8bs76\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:35 crc kubenswrapper[4669]: I1001 11:45:35.963625 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.013298 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.013407 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kp2s\" (UniqueName: \"kubernetes.io/projected/1f2a482e-2660-474a-9e8f-28c2d9b75648-kube-api-access-7kp2s\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.013442 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-config\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.013482 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 
11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.013509 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.013540 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.014572 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.014597 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.015230 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-config\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.015597 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.015818 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.027900 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.032566 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kp2s\" (UniqueName: \"kubernetes.io/projected/1f2a482e-2660-474a-9e8f-28c2d9b75648-kube-api-access-7kp2s\") pod \"dnsmasq-dns-76fcf4b695-mjq5j\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.068828 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8r7vt" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.196405 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.245063 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-j54rq"] Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.250141 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zgw2m"] Oct 01 11:45:36 crc kubenswrapper[4669]: W1001 11:45:36.275470 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0d5aa67_b932_4e08_a6ca_2eb3b4c37b49.slice/crio-ae0cb75c341849b2973a1e0048026f758a1383ae104116178247a2eaafa183b3 WatchSource:0}: Error finding container ae0cb75c341849b2973a1e0048026f758a1383ae104116178247a2eaafa183b3: Status 404 returned error can't find the container with id ae0cb75c341849b2973a1e0048026f758a1383ae104116178247a2eaafa183b3 Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.414247 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7697d5fb49-4zsxr"] Oct 01 11:45:36 crc kubenswrapper[4669]: W1001 11:45:36.440978 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod007c4768_f1c7_4750_a403_9a930798b8fb.slice/crio-7b89d2b6f4d8e2f0c39ac1f29dc0982fdc597d3af1881bc789d967d6287bab0f WatchSource:0}: Error finding container 7b89d2b6f4d8e2f0c39ac1f29dc0982fdc597d3af1881bc789d967d6287bab0f: Status 404 returned error can't find the container with id 7b89d2b6f4d8e2f0c39ac1f29dc0982fdc597d3af1881bc789d967d6287bab0f Oct 01 11:45:36 crc kubenswrapper[4669]: W1001 11:45:36.467553 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1ba96b9_d556_419e_a8a3_f90348499977.slice/crio-fba6352b99c106c6d02c70921546ef2655086a2e387e4d8dda7182310ea81690 WatchSource:0}: Error finding container 
fba6352b99c106c6d02c70921546ef2655086a2e387e4d8dda7182310ea81690: Status 404 returned error can't find the container with id fba6352b99c106c6d02c70921546ef2655086a2e387e4d8dda7182310ea81690 Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.479165 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.604762 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2cbqn"] Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.637527 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8r7vt"] Oct 01 11:45:36 crc kubenswrapper[4669]: W1001 11:45:36.637847 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee3395f1_0549_4bc4_a145_42ff20c37da6.slice/crio-ddc34707d1c53d697793fb461c7cb3cda50233e9c42940eaca087dce4f957b9e WatchSource:0}: Error finding container ddc34707d1c53d697793fb461c7cb3cda50233e9c42940eaca087dce4f957b9e: Status 404 returned error can't find the container with id ddc34707d1c53d697793fb461c7cb3cda50233e9c42940eaca087dce4f957b9e Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.656279 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f47dd5fdf-8bs76"] Oct 01 11:45:36 crc kubenswrapper[4669]: W1001 11:45:36.690941 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f2a482e_2660_474a_9e8f_28c2d9b75648.slice/crio-eadfb62822dc286ff82b5130b1cb9eac90209a0a9253b36e0c732de9395cf0ae WatchSource:0}: Error finding container eadfb62822dc286ff82b5130b1cb9eac90209a0a9253b36e0c732de9395cf0ae: Status 404 returned error can't find the container with id eadfb62822dc286ff82b5130b1cb9eac90209a0a9253b36e0c732de9395cf0ae Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.696505 4669 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mjq5j"] Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.880861 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f47dd5fdf-8bs76" event={"ID":"56bc6065-f53f-4531-b18b-d7cab77a717b","Type":"ContainerStarted","Data":"37f945371eeb8ce1cbc151205fa6f24aebe9c3004c85fa507315b7bad99f874d"} Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.882520 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2cbqn" event={"ID":"4814501d-3b55-40bb-b932-41f91ca1d7fb","Type":"ContainerStarted","Data":"00fbcfa86f2762c5fea4a5c1d0dc5fd6cf3611403d5d83d3d60cdf1908e465cb"} Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.885483 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1ba96b9-d556-419e-a8a3-f90348499977","Type":"ContainerStarted","Data":"fba6352b99c106c6d02c70921546ef2655086a2e387e4d8dda7182310ea81690"} Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.886676 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" event={"ID":"1f2a482e-2660-474a-9e8f-28c2d9b75648","Type":"ContainerStarted","Data":"eadfb62822dc286ff82b5130b1cb9eac90209a0a9253b36e0c732de9395cf0ae"} Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.887996 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7697d5fb49-4zsxr" event={"ID":"007c4768-f1c7-4750-a403-9a930798b8fb","Type":"ContainerStarted","Data":"7b89d2b6f4d8e2f0c39ac1f29dc0982fdc597d3af1881bc789d967d6287bab0f"} Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.889860 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zgw2m" event={"ID":"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49","Type":"ContainerStarted","Data":"cf9ebe8ddf8119487d1976169a5e06441f5e00cce2b3cb03d7880e9fda69925c"} Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.889963 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zgw2m" event={"ID":"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49","Type":"ContainerStarted","Data":"ae0cb75c341849b2973a1e0048026f758a1383ae104116178247a2eaafa183b3"} Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.891104 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8r7vt" event={"ID":"ee3395f1-0549-4bc4-a145-42ff20c37da6","Type":"ContainerStarted","Data":"ddc34707d1c53d697793fb461c7cb3cda50233e9c42940eaca087dce4f957b9e"} Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.892467 4669 generic.go:334] "Generic (PLEG): container finished" podID="d96401d3-0f9e-46ef-827e-28b42e2ca8d4" containerID="ee78a563314e8fe8377b0dc7525d20f9d7f2556f555da85dec3ef3c99f1d6e70" exitCode=0 Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.892507 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-j54rq" event={"ID":"d96401d3-0f9e-46ef-827e-28b42e2ca8d4","Type":"ContainerDied","Data":"ee78a563314e8fe8377b0dc7525d20f9d7f2556f555da85dec3ef3c99f1d6e70"} Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.892527 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-j54rq" event={"ID":"d96401d3-0f9e-46ef-827e-28b42e2ca8d4","Type":"ContainerStarted","Data":"71b00d6ad9d7f7536a9e01658a78267c0142c39ffaa446a986e3cda9383bd6a4"} Oct 01 11:45:36 crc kubenswrapper[4669]: I1001 11:45:36.913637 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zgw2m" podStartSLOduration=1.913610612 podStartE2EDuration="1.913610612s" podCreationTimestamp="2025-10-01 11:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:45:36.906899838 +0000 UTC m=+1028.006464815" watchObservedRunningTime="2025-10-01 11:45:36.913610612 +0000 UTC 
m=+1028.013175589" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.136110 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.250876 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-sb\") pod \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.250988 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-config\") pod \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.251197 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-swift-storage-0\") pod \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.251223 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-svc\") pod \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.251273 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk4d7\" (UniqueName: \"kubernetes.io/projected/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-kube-api-access-sk4d7\") pod \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " Oct 01 11:45:37 crc 
kubenswrapper[4669]: I1001 11:45:37.251333 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-nb\") pod \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\" (UID: \"d96401d3-0f9e-46ef-827e-28b42e2ca8d4\") " Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.280867 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-kube-api-access-sk4d7" (OuterVolumeSpecName: "kube-api-access-sk4d7") pod "d96401d3-0f9e-46ef-827e-28b42e2ca8d4" (UID: "d96401d3-0f9e-46ef-827e-28b42e2ca8d4"). InnerVolumeSpecName "kube-api-access-sk4d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.290123 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d96401d3-0f9e-46ef-827e-28b42e2ca8d4" (UID: "d96401d3-0f9e-46ef-827e-28b42e2ca8d4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.290701 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-config" (OuterVolumeSpecName: "config") pod "d96401d3-0f9e-46ef-827e-28b42e2ca8d4" (UID: "d96401d3-0f9e-46ef-827e-28b42e2ca8d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.292851 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d96401d3-0f9e-46ef-827e-28b42e2ca8d4" (UID: "d96401d3-0f9e-46ef-827e-28b42e2ca8d4"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.293216 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d96401d3-0f9e-46ef-827e-28b42e2ca8d4" (UID: "d96401d3-0f9e-46ef-827e-28b42e2ca8d4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.303768 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d96401d3-0f9e-46ef-827e-28b42e2ca8d4" (UID: "d96401d3-0f9e-46ef-827e-28b42e2ca8d4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.353582 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.353625 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.353639 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.353650 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk4d7\" (UniqueName: 
\"kubernetes.io/projected/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-kube-api-access-sk4d7\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.353659 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.353670 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d96401d3-0f9e-46ef-827e-28b42e2ca8d4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.711790 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7697d5fb49-4zsxr"] Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.740282 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55c957f569-jgz5q"] Oct 01 11:45:37 crc kubenswrapper[4669]: E1001 11:45:37.741362 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96401d3-0f9e-46ef-827e-28b42e2ca8d4" containerName="init" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.741384 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96401d3-0f9e-46ef-827e-28b42e2ca8d4" containerName="init" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.741705 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d96401d3-0f9e-46ef-827e-28b42e2ca8d4" containerName="init" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.759447 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.760737 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55c957f569-jgz5q"] Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.767627 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4663989c-0e40-4edc-a036-87db51b6dd1f-logs\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.767753 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q527\" (UniqueName: \"kubernetes.io/projected/4663989c-0e40-4edc-a036-87db51b6dd1f-kube-api-access-9q527\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.767949 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-config-data\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.768025 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-scripts\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.768124 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/4663989c-0e40-4edc-a036-87db51b6dd1f-horizon-secret-key\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.775571 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.869869 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4663989c-0e40-4edc-a036-87db51b6dd1f-horizon-secret-key\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.869940 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4663989c-0e40-4edc-a036-87db51b6dd1f-logs\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.870002 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q527\" (UniqueName: \"kubernetes.io/projected/4663989c-0e40-4edc-a036-87db51b6dd1f-kube-api-access-9q527\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.870053 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-config-data\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.870136 4669 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-scripts\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.886283 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-scripts\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.887987 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4663989c-0e40-4edc-a036-87db51b6dd1f-logs\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.888938 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-config-data\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.899944 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4663989c-0e40-4edc-a036-87db51b6dd1f-horizon-secret-key\") pod \"horizon-55c957f569-jgz5q\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.963879 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q527\" (UniqueName: \"kubernetes.io/projected/4663989c-0e40-4edc-a036-87db51b6dd1f-kube-api-access-9q527\") pod \"horizon-55c957f569-jgz5q\" 
(UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.991733 4669 generic.go:334] "Generic (PLEG): container finished" podID="1f2a482e-2660-474a-9e8f-28c2d9b75648" containerID="7b4ffc80ec3be32f758dbb45e373fda41638fb7e334558ff413d468d9882ba33" exitCode=0 Oct 01 11:45:37 crc kubenswrapper[4669]: I1001 11:45:37.991808 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" event={"ID":"1f2a482e-2660-474a-9e8f-28c2d9b75648","Type":"ContainerDied","Data":"7b4ffc80ec3be32f758dbb45e373fda41638fb7e334558ff413d468d9882ba33"} Oct 01 11:45:38 crc kubenswrapper[4669]: I1001 11:45:38.023289 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-j54rq" Oct 01 11:45:38 crc kubenswrapper[4669]: I1001 11:45:38.024223 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-j54rq" event={"ID":"d96401d3-0f9e-46ef-827e-28b42e2ca8d4","Type":"ContainerDied","Data":"71b00d6ad9d7f7536a9e01658a78267c0142c39ffaa446a986e3cda9383bd6a4"} Oct 01 11:45:38 crc kubenswrapper[4669]: I1001 11:45:38.024316 4669 scope.go:117] "RemoveContainer" containerID="ee78a563314e8fe8377b0dc7525d20f9d7f2556f555da85dec3ef3c99f1d6e70" Oct 01 11:45:38 crc kubenswrapper[4669]: I1001 11:45:38.090457 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:38 crc kubenswrapper[4669]: I1001 11:45:38.173333 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-j54rq"] Oct 01 11:45:38 crc kubenswrapper[4669]: I1001 11:45:38.181819 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-j54rq"] Oct 01 11:45:38 crc kubenswrapper[4669]: I1001 11:45:38.683800 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55c957f569-jgz5q"] Oct 01 11:45:39 crc kubenswrapper[4669]: I1001 11:45:39.034233 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55c957f569-jgz5q" event={"ID":"4663989c-0e40-4edc-a036-87db51b6dd1f","Type":"ContainerStarted","Data":"63c1bd1e4be46c0062ec1dace4a00584c329ed598adf9c4a3f213a656dd43317"} Oct 01 11:45:39 crc kubenswrapper[4669]: I1001 11:45:39.041397 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" event={"ID":"1f2a482e-2660-474a-9e8f-28c2d9b75648","Type":"ContainerStarted","Data":"e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da"} Oct 01 11:45:39 crc kubenswrapper[4669]: I1001 11:45:39.041628 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:39 crc kubenswrapper[4669]: I1001 11:45:39.074396 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" podStartSLOduration=4.074367358 podStartE2EDuration="4.074367358s" podCreationTimestamp="2025-10-01 11:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:45:39.066353262 +0000 UTC m=+1030.165918259" watchObservedRunningTime="2025-10-01 11:45:39.074367358 +0000 UTC m=+1030.173932335" Oct 01 11:45:39 crc kubenswrapper[4669]: I1001 11:45:39.663223 4669 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d96401d3-0f9e-46ef-827e-28b42e2ca8d4" path="/var/lib/kubelet/pods/d96401d3-0f9e-46ef-827e-28b42e2ca8d4/volumes" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.348468 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-h6rw6"] Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.350761 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.353065 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.353394 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.354370 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-h6rw6"] Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.355597 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-c5nsd" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.478116 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db2bb6cb-ab40-4534-967e-c71b62323512-etc-machine-id\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.478202 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-config-data\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 
11:45:40.478226 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-scripts\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.478484 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-db-sync-config-data\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.478861 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv22s\" (UniqueName: \"kubernetes.io/projected/db2bb6cb-ab40-4534-967e-c71b62323512-kube-api-access-mv22s\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.479159 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-combined-ca-bundle\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.580907 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db2bb6cb-ab40-4534-967e-c71b62323512-etc-machine-id\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.580989 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-config-data\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.581011 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-scripts\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.581048 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-db-sync-config-data\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.581130 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv22s\" (UniqueName: \"kubernetes.io/projected/db2bb6cb-ab40-4534-967e-c71b62323512-kube-api-access-mv22s\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.581179 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-combined-ca-bundle\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.582856 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/db2bb6cb-ab40-4534-967e-c71b62323512-etc-machine-id\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.590842 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-db-sync-config-data\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.591379 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-config-data\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.590016 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-combined-ca-bundle\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.595750 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-scripts\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.604587 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv22s\" (UniqueName: \"kubernetes.io/projected/db2bb6cb-ab40-4534-967e-c71b62323512-kube-api-access-mv22s\") pod \"cinder-db-sync-h6rw6\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " 
pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.696250 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.769975 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-nfxsr"] Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.772299 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.775055 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.776636 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-fwtz2" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.776943 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.784157 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nfxsr"] Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.886933 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-config\") pod \"neutron-db-sync-nfxsr\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.887023 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-combined-ca-bundle\") pod \"neutron-db-sync-nfxsr\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:45:40 crc 
kubenswrapper[4669]: I1001 11:45:40.887235 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtpqt\" (UniqueName: \"kubernetes.io/projected/15da5802-a63f-44a7-b5b2-9f85b62e6675-kube-api-access-gtpqt\") pod \"neutron-db-sync-nfxsr\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.989971 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtpqt\" (UniqueName: \"kubernetes.io/projected/15da5802-a63f-44a7-b5b2-9f85b62e6675-kube-api-access-gtpqt\") pod \"neutron-db-sync-nfxsr\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.990165 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-config\") pod \"neutron-db-sync-nfxsr\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.990278 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-combined-ca-bundle\") pod \"neutron-db-sync-nfxsr\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:45:40 crc kubenswrapper[4669]: I1001 11:45:40.999674 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-config\") pod \"neutron-db-sync-nfxsr\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:45:41 crc kubenswrapper[4669]: I1001 11:45:41.007064 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-combined-ca-bundle\") pod \"neutron-db-sync-nfxsr\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:45:41 crc kubenswrapper[4669]: I1001 11:45:41.011352 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtpqt\" (UniqueName: \"kubernetes.io/projected/15da5802-a63f-44a7-b5b2-9f85b62e6675-kube-api-access-gtpqt\") pod \"neutron-db-sync-nfxsr\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:45:41 crc kubenswrapper[4669]: I1001 11:45:41.081938 4669 generic.go:334] "Generic (PLEG): container finished" podID="f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" containerID="cf9ebe8ddf8119487d1976169a5e06441f5e00cce2b3cb03d7880e9fda69925c" exitCode=0 Oct 01 11:45:41 crc kubenswrapper[4669]: I1001 11:45:41.082005 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zgw2m" event={"ID":"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49","Type":"ContainerDied","Data":"cf9ebe8ddf8119487d1976169a5e06441f5e00cce2b3cb03d7880e9fda69925c"} Oct 01 11:45:41 crc kubenswrapper[4669]: I1001 11:45:41.100913 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.017259 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f47dd5fdf-8bs76"] Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.028460 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-866c85f5d8-mvd64"] Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.030940 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.036021 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-866c85f5d8-mvd64"] Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.038993 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.119372 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55c957f569-jgz5q"] Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.151820 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-74d4dc5744-kqwsh"] Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.153608 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.165954 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-config-data\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.166009 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62dab5a8-a8e3-4496-8187-089069b8e14f-logs\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.166054 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-combined-ca-bundle\") pod \"horizon-866c85f5d8-mvd64\" (UID: 
\"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.166129 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-scripts\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.166160 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-tls-certs\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.166183 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2bc7\" (UniqueName: \"kubernetes.io/projected/62dab5a8-a8e3-4496-8187-089069b8e14f-kube-api-access-r2bc7\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.166212 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-secret-key\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.178044 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74d4dc5744-kqwsh"] Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.267680 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/050a3c50-c6fb-4371-a309-af03e288d70d-scripts\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.267751 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050a3c50-c6fb-4371-a309-af03e288d70d-combined-ca-bundle\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.267782 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/050a3c50-c6fb-4371-a309-af03e288d70d-logs\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.267837 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-scripts\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.267867 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-tls-certs\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.267893 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2bc7\" (UniqueName: 
\"kubernetes.io/projected/62dab5a8-a8e3-4496-8187-089069b8e14f-kube-api-access-r2bc7\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.267915 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/050a3c50-c6fb-4371-a309-af03e288d70d-horizon-tls-certs\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.267948 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h5c6\" (UniqueName: \"kubernetes.io/projected/050a3c50-c6fb-4371-a309-af03e288d70d-kube-api-access-6h5c6\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.267968 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-secret-key\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.268013 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/050a3c50-c6fb-4371-a309-af03e288d70d-horizon-secret-key\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.268034 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/050a3c50-c6fb-4371-a309-af03e288d70d-config-data\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.268069 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-config-data\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.268108 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62dab5a8-a8e3-4496-8187-089069b8e14f-logs\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.268635 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-scripts\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.269693 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62dab5a8-a8e3-4496-8187-089069b8e14f-logs\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.269893 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-config-data\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " 
pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.269917 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-combined-ca-bundle\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.275781 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-combined-ca-bundle\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.275847 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-secret-key\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.276413 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-tls-certs\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.287152 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2bc7\" (UniqueName: \"kubernetes.io/projected/62dab5a8-a8e3-4496-8187-089069b8e14f-kube-api-access-r2bc7\") pod \"horizon-866c85f5d8-mvd64\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 
11:45:44.359948 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.372034 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/050a3c50-c6fb-4371-a309-af03e288d70d-horizon-secret-key\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.372131 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/050a3c50-c6fb-4371-a309-af03e288d70d-config-data\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.372201 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/050a3c50-c6fb-4371-a309-af03e288d70d-scripts\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.372237 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/050a3c50-c6fb-4371-a309-af03e288d70d-logs\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.372263 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050a3c50-c6fb-4371-a309-af03e288d70d-combined-ca-bundle\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " 
pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.372315 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/050a3c50-c6fb-4371-a309-af03e288d70d-horizon-tls-certs\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.372349 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h5c6\" (UniqueName: \"kubernetes.io/projected/050a3c50-c6fb-4371-a309-af03e288d70d-kube-api-access-6h5c6\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.373190 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/050a3c50-c6fb-4371-a309-af03e288d70d-logs\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.373277 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/050a3c50-c6fb-4371-a309-af03e288d70d-scripts\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.374755 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/050a3c50-c6fb-4371-a309-af03e288d70d-config-data\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.377687 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/050a3c50-c6fb-4371-a309-af03e288d70d-horizon-secret-key\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.383841 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050a3c50-c6fb-4371-a309-af03e288d70d-combined-ca-bundle\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.384414 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/050a3c50-c6fb-4371-a309-af03e288d70d-horizon-tls-certs\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.390513 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h5c6\" (UniqueName: \"kubernetes.io/projected/050a3c50-c6fb-4371-a309-af03e288d70d-kube-api-access-6h5c6\") pod \"horizon-74d4dc5744-kqwsh\" (UID: \"050a3c50-c6fb-4371-a309-af03e288d70d\") " pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:44 crc kubenswrapper[4669]: I1001 11:45:44.471575 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:45:46 crc kubenswrapper[4669]: I1001 11:45:46.199403 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:45:46 crc kubenswrapper[4669]: I1001 11:45:46.282480 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-nwdj2"] Oct 01 11:45:46 crc kubenswrapper[4669]: I1001 11:45:46.282901 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" podUID="941f43bf-37b4-451f-a1e9-53ebcbebd0f1" containerName="dnsmasq-dns" containerID="cri-o://a6ba5796a6d3416538a0af7dc289a2bc98f52a02d5a2bc45099a7dc04de54970" gracePeriod=10 Oct 01 11:45:47 crc kubenswrapper[4669]: I1001 11:45:47.161417 4669 generic.go:334] "Generic (PLEG): container finished" podID="941f43bf-37b4-451f-a1e9-53ebcbebd0f1" containerID="a6ba5796a6d3416538a0af7dc289a2bc98f52a02d5a2bc45099a7dc04de54970" exitCode=0 Oct 01 11:45:47 crc kubenswrapper[4669]: I1001 11:45:47.161488 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" event={"ID":"941f43bf-37b4-451f-a1e9-53ebcbebd0f1","Type":"ContainerDied","Data":"a6ba5796a6d3416538a0af7dc289a2bc98f52a02d5a2bc45099a7dc04de54970"} Oct 01 11:45:50 crc kubenswrapper[4669]: I1001 11:45:50.058899 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" podUID="941f43bf-37b4-451f-a1e9-53ebcbebd0f1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Oct 01 11:45:52 crc kubenswrapper[4669]: E1001 11:45:52.055451 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 01 11:45:52 crc kubenswrapper[4669]: E1001 
11:45:52.056172 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cflqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TT
Y:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-8r7vt_openstack(ee3395f1-0549-4bc4-a145-42ff20c37da6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 11:45:52 crc kubenswrapper[4669]: E1001 11:45:52.057711 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-8r7vt" podUID="ee3395f1-0549-4bc4-a145-42ff20c37da6" Oct 01 11:45:52 crc kubenswrapper[4669]: E1001 11:45:52.244473 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-8r7vt" podUID="ee3395f1-0549-4bc4-a145-42ff20c37da6" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.016800 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.095956 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-fernet-keys\") pod \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.096170 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-config-data\") pod \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.096317 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srwn8\" (UniqueName: \"kubernetes.io/projected/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-kube-api-access-srwn8\") pod \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.096371 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-credential-keys\") pod \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.096408 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-scripts\") pod \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.096562 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-combined-ca-bundle\") pod \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.104836 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" (UID: "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.106632 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" (UID: "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.107426 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-scripts" (OuterVolumeSpecName: "scripts") pod "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" (UID: "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.115583 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-kube-api-access-srwn8" (OuterVolumeSpecName: "kube-api-access-srwn8") pod "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" (UID: "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49"). InnerVolumeSpecName "kube-api-access-srwn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:54 crc kubenswrapper[4669]: E1001 11:45:54.131443 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-combined-ca-bundle podName:f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49 nodeName:}" failed. No retries permitted until 2025-10-01 11:45:54.631397338 +0000 UTC m=+1045.730962315 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-combined-ca-bundle") pod "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" (UID: "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49") : error deleting /var/lib/kubelet/pods/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49/volume-subpaths: remove /var/lib/kubelet/pods/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49/volume-subpaths: no such file or directory Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.144160 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-config-data" (OuterVolumeSpecName: "config-data") pod "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" (UID: "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.199648 4669 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.199696 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.199725 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srwn8\" (UniqueName: \"kubernetes.io/projected/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-kube-api-access-srwn8\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.199740 4669 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.199749 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.261651 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zgw2m" event={"ID":"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49","Type":"ContainerDied","Data":"ae0cb75c341849b2973a1e0048026f758a1383ae104116178247a2eaafa183b3"} Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.261741 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae0cb75c341849b2973a1e0048026f758a1383ae104116178247a2eaafa183b3" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.261820 4669 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zgw2m" Oct 01 11:45:54 crc kubenswrapper[4669]: E1001 11:45:54.577637 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 01 11:45:54 crc kubenswrapper[4669]: E1001 11:45:54.579202 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-txz9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePol
icy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-2cbqn_openstack(4814501d-3b55-40bb-b932-41f91ca1d7fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 11:45:54 crc kubenswrapper[4669]: E1001 11:45:54.581259 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-2cbqn" podUID="4814501d-3b55-40bb-b932-41f91ca1d7fb" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.711583 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-combined-ca-bundle\") pod \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\" (UID: \"f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49\") " Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.717436 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" (UID: "f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:45:54 crc kubenswrapper[4669]: I1001 11:45:54.815292 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:55 crc kubenswrapper[4669]: E1001 11:45:55.165442 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Oct 01 11:45:55 crc kubenswrapper[4669]: E1001 11:45:55.166147 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f5h69h6h5c8h59fhc4h579h7h5bbh556hf9hcdh54ch699h55ch554h5bdhd5h58fh79h648h59bh58fh68h5f9h658hd4h547h644h687h669h5f9q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,
SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jjmhm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e1ba96b9-d556-419e-a8a3-f90348499977): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.227242 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zgw2m"] Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.237830 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zgw2m"] Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.297694 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" event={"ID":"941f43bf-37b4-451f-a1e9-53ebcbebd0f1","Type":"ContainerDied","Data":"ac30adf3cd7a90b7e445df7c0294c829674719d40c1e6d0bbc45eef04bbe15b9"} Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.297765 4669 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac30adf3cd7a90b7e445df7c0294c829674719d40c1e6d0bbc45eef04bbe15b9" Oct 01 11:45:55 crc kubenswrapper[4669]: E1001 11:45:55.301499 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-2cbqn" podUID="4814501d-3b55-40bb-b932-41f91ca1d7fb" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.355531 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-997b7"] Oct 01 11:45:55 crc kubenswrapper[4669]: E1001 11:45:55.356609 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" containerName="keystone-bootstrap" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.356641 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" containerName="keystone-bootstrap" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.356875 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" containerName="keystone-bootstrap" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.357771 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.365593 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.365618 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.365637 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.366190 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fz8wd" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.381833 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-997b7"] Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.434939 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-combined-ca-bundle\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.434996 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-scripts\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.435162 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-credential-keys\") pod \"keystone-bootstrap-997b7\" (UID: 
\"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.435233 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf2q7\" (UniqueName: \"kubernetes.io/projected/55a28038-27bc-4a9f-be99-657225a3b9e5-kube-api-access-xf2q7\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.435271 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-fernet-keys\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.435297 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-config-data\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.467840 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.536994 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-nb\") pod \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.537173 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-swift-storage-0\") pod \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.537260 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-config\") pod \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.537311 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-svc\") pod \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.537826 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-sb\") pod \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.537917 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hvqx\" 
(UniqueName: \"kubernetes.io/projected/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-kube-api-access-8hvqx\") pod \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\" (UID: \"941f43bf-37b4-451f-a1e9-53ebcbebd0f1\") " Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.538562 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-credential-keys\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.538672 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf2q7\" (UniqueName: \"kubernetes.io/projected/55a28038-27bc-4a9f-be99-657225a3b9e5-kube-api-access-xf2q7\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.538722 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-fernet-keys\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.538880 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-config-data\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.539000 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-combined-ca-bundle\") pod 
\"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.539107 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-scripts\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.551491 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-combined-ca-bundle\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.551791 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-fernet-keys\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.562978 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-scripts\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.565995 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-kube-api-access-8hvqx" (OuterVolumeSpecName: "kube-api-access-8hvqx") pod "941f43bf-37b4-451f-a1e9-53ebcbebd0f1" (UID: "941f43bf-37b4-451f-a1e9-53ebcbebd0f1"). InnerVolumeSpecName "kube-api-access-8hvqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.567365 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-credential-keys\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.572275 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf2q7\" (UniqueName: \"kubernetes.io/projected/55a28038-27bc-4a9f-be99-657225a3b9e5-kube-api-access-xf2q7\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.574726 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-config-data\") pod \"keystone-bootstrap-997b7\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.642174 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hvqx\" (UniqueName: \"kubernetes.io/projected/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-kube-api-access-8hvqx\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.659189 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49" path="/var/lib/kubelet/pods/f0d5aa67-b932-4e08-a6ca-2eb3b4c37b49/volumes" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.718268 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74d4dc5744-kqwsh"] Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.725209 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-config" (OuterVolumeSpecName: "config") pod "941f43bf-37b4-451f-a1e9-53ebcbebd0f1" (UID: "941f43bf-37b4-451f-a1e9-53ebcbebd0f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.734917 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "941f43bf-37b4-451f-a1e9-53ebcbebd0f1" (UID: "941f43bf-37b4-451f-a1e9-53ebcbebd0f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:55 crc kubenswrapper[4669]: W1001 11:45:55.737997 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod050a3c50_c6fb_4371_a309_af03e288d70d.slice/crio-7d85cdc4e1d46490ca96236459f444f387356ba62818c8e799e9a0e9e40c7840 WatchSource:0}: Error finding container 7d85cdc4e1d46490ca96236459f444f387356ba62818c8e799e9a0e9e40c7840: Status 404 returned error can't find the container with id 7d85cdc4e1d46490ca96236459f444f387356ba62818c8e799e9a0e9e40c7840 Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.742673 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "941f43bf-37b4-451f-a1e9-53ebcbebd0f1" (UID: "941f43bf-37b4-451f-a1e9-53ebcbebd0f1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.744298 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.744322 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.744334 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.758503 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "941f43bf-37b4-451f-a1e9-53ebcbebd0f1" (UID: "941f43bf-37b4-451f-a1e9-53ebcbebd0f1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.780543 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "941f43bf-37b4-451f-a1e9-53ebcbebd0f1" (UID: "941f43bf-37b4-451f-a1e9-53ebcbebd0f1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.795761 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nfxsr"] Oct 01 11:45:55 crc kubenswrapper[4669]: W1001 11:45:55.810045 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15da5802_a63f_44a7_b5b2_9f85b62e6675.slice/crio-7200a839fb92a77877c931470b817cc1470627ed6dcafee8c07f0bb29e2076f9 WatchSource:0}: Error finding container 7200a839fb92a77877c931470b817cc1470627ed6dcafee8c07f0bb29e2076f9: Status 404 returned error can't find the container with id 7200a839fb92a77877c931470b817cc1470627ed6dcafee8c07f0bb29e2076f9 Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.831446 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-997b7" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.846303 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.846336 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/941f43bf-37b4-451f-a1e9-53ebcbebd0f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.880134 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-h6rw6"] Oct 01 11:45:55 crc kubenswrapper[4669]: I1001 11:45:55.889235 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-866c85f5d8-mvd64"] Oct 01 11:45:55 crc kubenswrapper[4669]: W1001 11:45:55.919965 4669 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb2bb6cb_ab40_4534_967e_c71b62323512.slice/crio-4e4194e0182535e53e4af094857a4c8aca1f483a1262bbeb8b301d806b6a2ccc WatchSource:0}: Error finding container 4e4194e0182535e53e4af094857a4c8aca1f483a1262bbeb8b301d806b6a2ccc: Status 404 returned error can't find the container with id 4e4194e0182535e53e4af094857a4c8aca1f483a1262bbeb8b301d806b6a2ccc Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.321420 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-h6rw6" event={"ID":"db2bb6cb-ab40-4534-967e-c71b62323512","Type":"ContainerStarted","Data":"4e4194e0182535e53e4af094857a4c8aca1f483a1262bbeb8b301d806b6a2ccc"} Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.326243 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nfxsr" event={"ID":"15da5802-a63f-44a7-b5b2-9f85b62e6675","Type":"ContainerStarted","Data":"7200a839fb92a77877c931470b817cc1470627ed6dcafee8c07f0bb29e2076f9"} Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.338554 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s89kf" event={"ID":"6c85d289-ff7f-4b57-a54a-cb272dec58e2","Type":"ContainerStarted","Data":"609b41669b067291745a6bbf95e71d443920b227d4e31771406a5637776767ba"} Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.343017 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7697d5fb49-4zsxr" event={"ID":"007c4768-f1c7-4750-a403-9a930798b8fb","Type":"ContainerStarted","Data":"878df99570804bbb4ca83d6b364f11a1580d55b0999fc8235a0bdc70ec78ab35"} Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.343224 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7697d5fb49-4zsxr" podUID="007c4768-f1c7-4750-a403-9a930798b8fb" containerName="horizon-log" containerID="cri-o://878df99570804bbb4ca83d6b364f11a1580d55b0999fc8235a0bdc70ec78ab35" 
gracePeriod=30 Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.343339 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7697d5fb49-4zsxr" podUID="007c4768-f1c7-4750-a403-9a930798b8fb" containerName="horizon" containerID="cri-o://77e8fd6f30440d5c23e630e598d5127008ce249f884e507c028aad52ab19fb53" gracePeriod=30 Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.351465 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-nfxsr" podStartSLOduration=16.351447609 podStartE2EDuration="16.351447609s" podCreationTimestamp="2025-10-01 11:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:45:56.345454494 +0000 UTC m=+1047.445019471" watchObservedRunningTime="2025-10-01 11:45:56.351447609 +0000 UTC m=+1047.451012586" Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.365629 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74d4dc5744-kqwsh" event={"ID":"050a3c50-c6fb-4371-a309-af03e288d70d","Type":"ContainerStarted","Data":"31dca0d605fb36513145eb8d8496362927bd4c425c5b4d942b129bf933f37b14"} Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.365685 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74d4dc5744-kqwsh" event={"ID":"050a3c50-c6fb-4371-a309-af03e288d70d","Type":"ContainerStarted","Data":"7d85cdc4e1d46490ca96236459f444f387356ba62818c8e799e9a0e9e40c7840"} Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.368308 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55c957f569-jgz5q" event={"ID":"4663989c-0e40-4edc-a036-87db51b6dd1f","Type":"ContainerStarted","Data":"f50645d7f8a1075223cdc0cd5c60f3fb85a492b6dcf3c7166e5ef9531213c34c"} Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.368346 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-55c957f569-jgz5q" event={"ID":"4663989c-0e40-4edc-a036-87db51b6dd1f","Type":"ContainerStarted","Data":"428d88cb406c771e4b1e4d9dcd129cde4c277cf86afae6378484d7c0dac94377"} Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.368502 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55c957f569-jgz5q" podUID="4663989c-0e40-4edc-a036-87db51b6dd1f" containerName="horizon-log" containerID="cri-o://428d88cb406c771e4b1e4d9dcd129cde4c277cf86afae6378484d7c0dac94377" gracePeriod=30 Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.369047 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55c957f569-jgz5q" podUID="4663989c-0e40-4edc-a036-87db51b6dd1f" containerName="horizon" containerID="cri-o://f50645d7f8a1075223cdc0cd5c60f3fb85a492b6dcf3c7166e5ef9531213c34c" gracePeriod=30 Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.378883 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7697d5fb49-4zsxr" podStartSLOduration=2.660959273 podStartE2EDuration="21.378864818s" podCreationTimestamp="2025-10-01 11:45:35 +0000 UTC" firstStartedPulling="2025-10-01 11:45:36.445370554 +0000 UTC m=+1027.544935531" lastFinishedPulling="2025-10-01 11:45:55.163276089 +0000 UTC m=+1046.262841076" observedRunningTime="2025-10-01 11:45:56.375497597 +0000 UTC m=+1047.475062574" watchObservedRunningTime="2025-10-01 11:45:56.378864818 +0000 UTC m=+1047.478429795" Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.382801 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866c85f5d8-mvd64" event={"ID":"62dab5a8-a8e3-4496-8187-089069b8e14f","Type":"ContainerStarted","Data":"7a9a633ab1c7f42ed47d2997d9dc885d00311674cc27bac51f072d45c539d89b"} Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.390626 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.390867 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f47dd5fdf-8bs76" podUID="56bc6065-f53f-4531-b18b-d7cab77a717b" containerName="horizon" containerID="cri-o://d26c4fb93d9265472f91f0f34dad72565a4d975b3d9bd64355d7c695d7486d3d" gracePeriod=30 Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.390976 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f47dd5fdf-8bs76" event={"ID":"56bc6065-f53f-4531-b18b-d7cab77a717b","Type":"ContainerStarted","Data":"d26c4fb93d9265472f91f0f34dad72565a4d975b3d9bd64355d7c695d7486d3d"} Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.391020 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f47dd5fdf-8bs76" event={"ID":"56bc6065-f53f-4531-b18b-d7cab77a717b","Type":"ContainerStarted","Data":"7589ef84f4911b8fd022050a72b62ec491e1d2d9147045a7f2a7a5b07a4c7a21"} Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.390618 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f47dd5fdf-8bs76" podUID="56bc6065-f53f-4531-b18b-d7cab77a717b" containerName="horizon-log" containerID="cri-o://7589ef84f4911b8fd022050a72b62ec491e1d2d9147045a7f2a7a5b07a4c7a21" gracePeriod=30 Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.417281 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-s89kf" podStartSLOduration=5.778364661 podStartE2EDuration="53.417262084s" podCreationTimestamp="2025-10-01 11:45:03 +0000 UTC" firstStartedPulling="2025-10-01 11:45:04.455358983 +0000 UTC m=+995.554923980" lastFinishedPulling="2025-10-01 11:45:52.094256426 +0000 UTC m=+1043.193821403" observedRunningTime="2025-10-01 11:45:56.39575279 +0000 UTC m=+1047.495317767" watchObservedRunningTime="2025-10-01 11:45:56.417262084 +0000 UTC 
m=+1047.516827061" Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.436177 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-997b7"] Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.441501 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f47dd5fdf-8bs76" podStartSLOduration=2.891191408 podStartE2EDuration="21.441478865s" podCreationTimestamp="2025-10-01 11:45:35 +0000 UTC" firstStartedPulling="2025-10-01 11:45:36.684243209 +0000 UTC m=+1027.783808186" lastFinishedPulling="2025-10-01 11:45:55.234530666 +0000 UTC m=+1046.334095643" observedRunningTime="2025-10-01 11:45:56.425660749 +0000 UTC m=+1047.525225736" watchObservedRunningTime="2025-10-01 11:45:56.441478865 +0000 UTC m=+1047.541043842" Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.451646 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-55c957f569-jgz5q" podStartSLOduration=2.915170599 podStartE2EDuration="19.451592212s" podCreationTimestamp="2025-10-01 11:45:37 +0000 UTC" firstStartedPulling="2025-10-01 11:45:38.691184774 +0000 UTC m=+1029.790749751" lastFinishedPulling="2025-10-01 11:45:55.227606387 +0000 UTC m=+1046.327171364" observedRunningTime="2025-10-01 11:45:56.449649185 +0000 UTC m=+1047.549214162" watchObservedRunningTime="2025-10-01 11:45:56.451592212 +0000 UTC m=+1047.551157179" Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.518729 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-nwdj2"] Oct 01 11:45:56 crc kubenswrapper[4669]: I1001 11:45:56.531190 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-nwdj2"] Oct 01 11:45:57 crc kubenswrapper[4669]: I1001 11:45:57.401548 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-997b7" 
event={"ID":"55a28038-27bc-4a9f-be99-657225a3b9e5","Type":"ContainerStarted","Data":"3fefcc43736ea6e549f0effcb874d8eaf368f5dc8c96e5a31de2e1e49dab6292"} Oct 01 11:45:57 crc kubenswrapper[4669]: I1001 11:45:57.401914 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-997b7" event={"ID":"55a28038-27bc-4a9f-be99-657225a3b9e5","Type":"ContainerStarted","Data":"15de6136381e63f08e8322b7ad7506aa200dcce481848b9834755bc8192a5314"} Oct 01 11:45:57 crc kubenswrapper[4669]: I1001 11:45:57.405352 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7697d5fb49-4zsxr" event={"ID":"007c4768-f1c7-4750-a403-9a930798b8fb","Type":"ContainerStarted","Data":"77e8fd6f30440d5c23e630e598d5127008ce249f884e507c028aad52ab19fb53"} Oct 01 11:45:57 crc kubenswrapper[4669]: I1001 11:45:57.430738 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-997b7" podStartSLOduration=2.430711216 podStartE2EDuration="2.430711216s" podCreationTimestamp="2025-10-01 11:45:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:45:57.426202196 +0000 UTC m=+1048.525767183" watchObservedRunningTime="2025-10-01 11:45:57.430711216 +0000 UTC m=+1048.530276193" Oct 01 11:45:57 crc kubenswrapper[4669]: I1001 11:45:57.441414 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74d4dc5744-kqwsh" event={"ID":"050a3c50-c6fb-4371-a309-af03e288d70d","Type":"ContainerStarted","Data":"5b43b6007bad09d60f44f8aa5278948d20669e0096921eecb9893cae43382b97"} Oct 01 11:45:57 crc kubenswrapper[4669]: I1001 11:45:57.459179 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866c85f5d8-mvd64" event={"ID":"62dab5a8-a8e3-4496-8187-089069b8e14f","Type":"ContainerStarted","Data":"6836b5a3a12e44f349fe24052dfc7816b67cc72171ee1b8dc050d26ad2b5f3bc"} Oct 01 11:45:57 crc 
kubenswrapper[4669]: I1001 11:45:57.459233 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866c85f5d8-mvd64" event={"ID":"62dab5a8-a8e3-4496-8187-089069b8e14f","Type":"ContainerStarted","Data":"e291c34c35b2e8ea3b830f371585d97db07df75e845acb5850aa9ed5690727d9"} Oct 01 11:45:57 crc kubenswrapper[4669]: I1001 11:45:57.482026 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-74d4dc5744-kqwsh" podStartSLOduration=13.482000977 podStartE2EDuration="13.482000977s" podCreationTimestamp="2025-10-01 11:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:45:57.476122764 +0000 UTC m=+1048.575687741" watchObservedRunningTime="2025-10-01 11:45:57.482000977 +0000 UTC m=+1048.581565954" Oct 01 11:45:57 crc kubenswrapper[4669]: I1001 11:45:57.483853 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nfxsr" event={"ID":"15da5802-a63f-44a7-b5b2-9f85b62e6675","Type":"ContainerStarted","Data":"0ae6a7126e7381ab1718ccd6b3e2637a6ce2cd81c70734a872755d4a57a58bd9"} Oct 01 11:45:57 crc kubenswrapper[4669]: I1001 11:45:57.502568 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-866c85f5d8-mvd64" podStartSLOduration=14.502541768 podStartE2EDuration="14.502541768s" podCreationTimestamp="2025-10-01 11:45:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:45:57.501787019 +0000 UTC m=+1048.601351996" watchObservedRunningTime="2025-10-01 11:45:57.502541768 +0000 UTC m=+1048.602106745" Oct 01 11:45:57 crc kubenswrapper[4669]: I1001 11:45:57.657253 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="941f43bf-37b4-451f-a1e9-53ebcbebd0f1" path="/var/lib/kubelet/pods/941f43bf-37b4-451f-a1e9-53ebcbebd0f1/volumes" Oct 01 
11:45:58 crc kubenswrapper[4669]: I1001 11:45:58.091400 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:45:59 crc kubenswrapper[4669]: I1001 11:45:59.536343 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1ba96b9-d556-419e-a8a3-f90348499977","Type":"ContainerStarted","Data":"665f62049f18fe7951920d0471c64261f10054396f5f92b033f79ab723ab1d3c"} Oct 01 11:46:00 crc kubenswrapper[4669]: I1001 11:46:00.058889 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-nwdj2" podUID="941f43bf-37b4-451f-a1e9-53ebcbebd0f1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Oct 01 11:46:01 crc kubenswrapper[4669]: I1001 11:46:01.863798 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:46:01 crc kubenswrapper[4669]: I1001 11:46:01.864271 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:46:01 crc kubenswrapper[4669]: I1001 11:46:01.864345 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:46:01 crc kubenswrapper[4669]: I1001 11:46:01.865264 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ef1fa470dbb217bde08acd53a153a9e8382565310fe4c3c6cd2c78b6a193aa31"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 11:46:01 crc kubenswrapper[4669]: I1001 11:46:01.865330 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://ef1fa470dbb217bde08acd53a153a9e8382565310fe4c3c6cd2c78b6a193aa31" gracePeriod=600 Oct 01 11:46:02 crc kubenswrapper[4669]: I1001 11:46:02.593503 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="ef1fa470dbb217bde08acd53a153a9e8382565310fe4c3c6cd2c78b6a193aa31" exitCode=0 Oct 01 11:46:02 crc kubenswrapper[4669]: I1001 11:46:02.593557 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"ef1fa470dbb217bde08acd53a153a9e8382565310fe4c3c6cd2c78b6a193aa31"} Oct 01 11:46:02 crc kubenswrapper[4669]: I1001 11:46:02.593979 4669 scope.go:117] "RemoveContainer" containerID="86579f99b2d7fdefab555c5926d95a0899a74cade0993be4e08705b39fe0421d" Oct 01 11:46:03 crc kubenswrapper[4669]: I1001 11:46:03.608679 4669 generic.go:334] "Generic (PLEG): container finished" podID="55a28038-27bc-4a9f-be99-657225a3b9e5" containerID="3fefcc43736ea6e549f0effcb874d8eaf368f5dc8c96e5a31de2e1e49dab6292" exitCode=0 Oct 01 11:46:03 crc kubenswrapper[4669]: I1001 11:46:03.608750 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-997b7" event={"ID":"55a28038-27bc-4a9f-be99-657225a3b9e5","Type":"ContainerDied","Data":"3fefcc43736ea6e549f0effcb874d8eaf368f5dc8c96e5a31de2e1e49dab6292"} Oct 
01 11:46:04 crc kubenswrapper[4669]: I1001 11:46:04.360360 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:46:04 crc kubenswrapper[4669]: I1001 11:46:04.361000 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:46:04 crc kubenswrapper[4669]: I1001 11:46:04.474529 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:46:04 crc kubenswrapper[4669]: I1001 11:46:04.477177 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:46:05 crc kubenswrapper[4669]: I1001 11:46:05.721196 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:46:06 crc kubenswrapper[4669]: I1001 11:46:06.028920 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:46:14 crc kubenswrapper[4669]: I1001 11:46:14.363105 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-866c85f5d8-mvd64" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 01 11:46:14 crc kubenswrapper[4669]: I1001 11:46:14.476860 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-74d4dc5744-kqwsh" podUID="050a3c50-c6fb-4371-a309-af03e288d70d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.515473 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-997b7" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.681142 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf2q7\" (UniqueName: \"kubernetes.io/projected/55a28038-27bc-4a9f-be99-657225a3b9e5-kube-api-access-xf2q7\") pod \"55a28038-27bc-4a9f-be99-657225a3b9e5\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.681279 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-scripts\") pod \"55a28038-27bc-4a9f-be99-657225a3b9e5\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.681308 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-credential-keys\") pod \"55a28038-27bc-4a9f-be99-657225a3b9e5\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.681336 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-fernet-keys\") pod \"55a28038-27bc-4a9f-be99-657225a3b9e5\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.681388 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-combined-ca-bundle\") pod \"55a28038-27bc-4a9f-be99-657225a3b9e5\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.681412 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-config-data\") pod \"55a28038-27bc-4a9f-be99-657225a3b9e5\" (UID: \"55a28038-27bc-4a9f-be99-657225a3b9e5\") " Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.690192 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "55a28038-27bc-4a9f-be99-657225a3b9e5" (UID: "55a28038-27bc-4a9f-be99-657225a3b9e5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.690936 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a28038-27bc-4a9f-be99-657225a3b9e5-kube-api-access-xf2q7" (OuterVolumeSpecName: "kube-api-access-xf2q7") pod "55a28038-27bc-4a9f-be99-657225a3b9e5" (UID: "55a28038-27bc-4a9f-be99-657225a3b9e5"). InnerVolumeSpecName "kube-api-access-xf2q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.693282 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-scripts" (OuterVolumeSpecName: "scripts") pod "55a28038-27bc-4a9f-be99-657225a3b9e5" (UID: "55a28038-27bc-4a9f-be99-657225a3b9e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.696642 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "55a28038-27bc-4a9f-be99-657225a3b9e5" (UID: "55a28038-27bc-4a9f-be99-657225a3b9e5"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.723056 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55a28038-27bc-4a9f-be99-657225a3b9e5" (UID: "55a28038-27bc-4a9f-be99-657225a3b9e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.735162 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-config-data" (OuterVolumeSpecName: "config-data") pod "55a28038-27bc-4a9f-be99-657225a3b9e5" (UID: "55a28038-27bc-4a9f-be99-657225a3b9e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.755556 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-997b7" event={"ID":"55a28038-27bc-4a9f-be99-657225a3b9e5","Type":"ContainerDied","Data":"15de6136381e63f08e8322b7ad7506aa200dcce481848b9834755bc8192a5314"} Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.755611 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15de6136381e63f08e8322b7ad7506aa200dcce481848b9834755bc8192a5314" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.755703 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-997b7" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.784105 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.784158 4669 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.784170 4669 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.784180 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.784189 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a28038-27bc-4a9f-be99-657225a3b9e5-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:16 crc kubenswrapper[4669]: I1001 11:46:16.784199 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf2q7\" (UniqueName: \"kubernetes.io/projected/55a28038-27bc-4a9f-be99-657225a3b9e5-kube-api-access-xf2q7\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.684106 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5d99769bb4-lq4fx"] Oct 01 11:46:17 crc kubenswrapper[4669]: E1001 11:46:17.684874 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941f43bf-37b4-451f-a1e9-53ebcbebd0f1" 
containerName="init" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.684890 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="941f43bf-37b4-451f-a1e9-53ebcbebd0f1" containerName="init" Oct 01 11:46:17 crc kubenswrapper[4669]: E1001 11:46:17.684917 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a28038-27bc-4a9f-be99-657225a3b9e5" containerName="keystone-bootstrap" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.684924 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a28038-27bc-4a9f-be99-657225a3b9e5" containerName="keystone-bootstrap" Oct 01 11:46:17 crc kubenswrapper[4669]: E1001 11:46:17.684941 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941f43bf-37b4-451f-a1e9-53ebcbebd0f1" containerName="dnsmasq-dns" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.684946 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="941f43bf-37b4-451f-a1e9-53ebcbebd0f1" containerName="dnsmasq-dns" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.685789 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="941f43bf-37b4-451f-a1e9-53ebcbebd0f1" containerName="dnsmasq-dns" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.685812 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a28038-27bc-4a9f-be99-657225a3b9e5" containerName="keystone-bootstrap" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.686596 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.691155 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.691460 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.691641 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.695691 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.695702 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fz8wd" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.697175 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d99769bb4-lq4fx"] Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.704006 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.808771 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-config-data\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.809114 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-credential-keys\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " 
pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.809261 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-scripts\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.809309 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-combined-ca-bundle\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.809338 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-public-tls-certs\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.809379 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-fernet-keys\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.809417 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-internal-tls-certs\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " 
pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.809464 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ns9v\" (UniqueName: \"kubernetes.io/projected/85b6fded-ed15-47f3-8e06-23511061f9b1-kube-api-access-7ns9v\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.911486 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-config-data\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.911582 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-credential-keys\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.911640 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-scripts\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.911666 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-combined-ca-bundle\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc 
kubenswrapper[4669]: I1001 11:46:17.911688 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-public-tls-certs\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.911739 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-fernet-keys\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.911780 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-internal-tls-certs\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.911829 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ns9v\" (UniqueName: \"kubernetes.io/projected/85b6fded-ed15-47f3-8e06-23511061f9b1-kube-api-access-7ns9v\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.918791 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-scripts\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.919046 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-combined-ca-bundle\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.921674 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-internal-tls-certs\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.921887 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-config-data\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.925649 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-public-tls-certs\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.925872 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-fernet-keys\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.926039 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/85b6fded-ed15-47f3-8e06-23511061f9b1-credential-keys\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:17 crc kubenswrapper[4669]: I1001 11:46:17.931581 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ns9v\" (UniqueName: \"kubernetes.io/projected/85b6fded-ed15-47f3-8e06-23511061f9b1-kube-api-access-7ns9v\") pod \"keystone-5d99769bb4-lq4fx\" (UID: \"85b6fded-ed15-47f3-8e06-23511061f9b1\") " pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:18 crc kubenswrapper[4669]: I1001 11:46:18.029711 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:18 crc kubenswrapper[4669]: E1001 11:46:18.095713 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 01 11:46:18 crc kubenswrapper[4669]: E1001 11:46:18.096468 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mv22s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-h6rw6_openstack(db2bb6cb-ab40-4534-967e-c71b62323512): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 11:46:18 crc kubenswrapper[4669]: E1001 11:46:18.097672 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-h6rw6" podUID="db2bb6cb-ab40-4534-967e-c71b62323512" Oct 01 11:46:18 crc kubenswrapper[4669]: I1001 11:46:18.679904 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d99769bb4-lq4fx"] Oct 01 11:46:18 crc kubenswrapper[4669]: W1001 11:46:18.682273 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b6fded_ed15_47f3_8e06_23511061f9b1.slice/crio-a99dca2f974fc03fdf3d57bbdcfc2eb101884bf9ed3d9b12448840ae96bb9940 WatchSource:0}: Error finding container a99dca2f974fc03fdf3d57bbdcfc2eb101884bf9ed3d9b12448840ae96bb9940: Status 404 returned error can't find the container with id a99dca2f974fc03fdf3d57bbdcfc2eb101884bf9ed3d9b12448840ae96bb9940 Oct 01 11:46:18 crc kubenswrapper[4669]: I1001 11:46:18.808414 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2cbqn" event={"ID":"4814501d-3b55-40bb-b932-41f91ca1d7fb","Type":"ContainerStarted","Data":"63f3bc3c7a3e9ae9344dc85fbe8e3df0db83cb54290f6006f17fe64976a0c746"} Oct 01 11:46:18 crc kubenswrapper[4669]: I1001 11:46:18.813114 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e1ba96b9-d556-419e-a8a3-f90348499977","Type":"ContainerStarted","Data":"29a5f95506edd88a3900874b31b4f2fd1debe97b135916f4acefaf0c6ec2da85"} Oct 01 11:46:18 crc kubenswrapper[4669]: I1001 11:46:18.815251 4669 generic.go:334] "Generic (PLEG): container finished" podID="6c85d289-ff7f-4b57-a54a-cb272dec58e2" containerID="609b41669b067291745a6bbf95e71d443920b227d4e31771406a5637776767ba" exitCode=0 Oct 01 11:46:18 crc kubenswrapper[4669]: I1001 11:46:18.815309 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s89kf" event={"ID":"6c85d289-ff7f-4b57-a54a-cb272dec58e2","Type":"ContainerDied","Data":"609b41669b067291745a6bbf95e71d443920b227d4e31771406a5637776767ba"} Oct 01 11:46:18 crc kubenswrapper[4669]: I1001 11:46:18.817901 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8r7vt" event={"ID":"ee3395f1-0549-4bc4-a145-42ff20c37da6","Type":"ContainerStarted","Data":"4165672b7ed2689527620aa0e5fe0c14b451ea5de820cb08dc80918417dcde21"} Oct 01 11:46:18 crc kubenswrapper[4669]: I1001 11:46:18.820458 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"7e52cf47b1ea2351c50bcd89b78dca4005cb050fc916aa94ef178ab99a189cf3"} Oct 01 11:46:18 crc kubenswrapper[4669]: I1001 11:46:18.823507 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d99769bb4-lq4fx" event={"ID":"85b6fded-ed15-47f3-8e06-23511061f9b1","Type":"ContainerStarted","Data":"a99dca2f974fc03fdf3d57bbdcfc2eb101884bf9ed3d9b12448840ae96bb9940"} Oct 01 11:46:18 crc kubenswrapper[4669]: E1001 11:46:18.824034 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" 
pod="openstack/cinder-db-sync-h6rw6" podUID="db2bb6cb-ab40-4534-967e-c71b62323512" Oct 01 11:46:18 crc kubenswrapper[4669]: I1001 11:46:18.860219 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-2cbqn" podStartSLOduration=2.328972351 podStartE2EDuration="43.860188249s" podCreationTimestamp="2025-10-01 11:45:35 +0000 UTC" firstStartedPulling="2025-10-01 11:45:36.611811563 +0000 UTC m=+1027.711376540" lastFinishedPulling="2025-10-01 11:46:18.143027461 +0000 UTC m=+1069.242592438" observedRunningTime="2025-10-01 11:46:18.83030265 +0000 UTC m=+1069.929867627" watchObservedRunningTime="2025-10-01 11:46:18.860188249 +0000 UTC m=+1069.959753226" Oct 01 11:46:18 crc kubenswrapper[4669]: I1001 11:46:18.965944 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8r7vt" podStartSLOduration=2.4769412969999998 podStartE2EDuration="43.965915386s" podCreationTimestamp="2025-10-01 11:45:35 +0000 UTC" firstStartedPulling="2025-10-01 11:45:36.651757837 +0000 UTC m=+1027.751322814" lastFinishedPulling="2025-10-01 11:46:18.140731886 +0000 UTC m=+1069.240296903" observedRunningTime="2025-10-01 11:46:18.955941103 +0000 UTC m=+1070.055506080" watchObservedRunningTime="2025-10-01 11:46:18.965915386 +0000 UTC m=+1070.065480363" Oct 01 11:46:19 crc kubenswrapper[4669]: I1001 11:46:19.837347 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d99769bb4-lq4fx" event={"ID":"85b6fded-ed15-47f3-8e06-23511061f9b1","Type":"ContainerStarted","Data":"62671e5196dc18d7b6bde2f3e460b45fedba897f092b56b194a220e281074321"} Oct 01 11:46:19 crc kubenswrapper[4669]: I1001 11:46:19.838498 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:19 crc kubenswrapper[4669]: I1001 11:46:19.870560 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5d99769bb4-lq4fx" 
podStartSLOduration=2.870527664 podStartE2EDuration="2.870527664s" podCreationTimestamp="2025-10-01 11:46:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:19.862839257 +0000 UTC m=+1070.962404234" watchObservedRunningTime="2025-10-01 11:46:19.870527664 +0000 UTC m=+1070.970092651" Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.463561 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s89kf" Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.588492 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-config-data\") pod \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.589009 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvggx\" (UniqueName: \"kubernetes.io/projected/6c85d289-ff7f-4b57-a54a-cb272dec58e2-kube-api-access-kvggx\") pod \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.589124 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-db-sync-config-data\") pod \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\" (UID: \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.589188 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-combined-ca-bundle\") pod \"6c85d289-ff7f-4b57-a54a-cb272dec58e2\" (UID: 
\"6c85d289-ff7f-4b57-a54a-cb272dec58e2\") " Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.596995 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c85d289-ff7f-4b57-a54a-cb272dec58e2-kube-api-access-kvggx" (OuterVolumeSpecName: "kube-api-access-kvggx") pod "6c85d289-ff7f-4b57-a54a-cb272dec58e2" (UID: "6c85d289-ff7f-4b57-a54a-cb272dec58e2"). InnerVolumeSpecName "kube-api-access-kvggx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.597434 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6c85d289-ff7f-4b57-a54a-cb272dec58e2" (UID: "6c85d289-ff7f-4b57-a54a-cb272dec58e2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.630158 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c85d289-ff7f-4b57-a54a-cb272dec58e2" (UID: "6c85d289-ff7f-4b57-a54a-cb272dec58e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.643524 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-config-data" (OuterVolumeSpecName: "config-data") pod "6c85d289-ff7f-4b57-a54a-cb272dec58e2" (UID: "6c85d289-ff7f-4b57-a54a-cb272dec58e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.691917 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.691964 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvggx\" (UniqueName: \"kubernetes.io/projected/6c85d289-ff7f-4b57-a54a-cb272dec58e2-kube-api-access-kvggx\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.691979 4669 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.691988 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c85d289-ff7f-4b57-a54a-cb272dec58e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.850158 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s89kf" Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.851446 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s89kf" event={"ID":"6c85d289-ff7f-4b57-a54a-cb272dec58e2","Type":"ContainerDied","Data":"291c9bd23cd59d41cbfd4c61d2e1dd77bbf9286038dfecc6226508fa1be80c33"} Oct 01 11:46:20 crc kubenswrapper[4669]: I1001 11:46:20.851605 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="291c9bd23cd59d41cbfd4c61d2e1dd77bbf9286038dfecc6226508fa1be80c33" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.363728 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-45rqt"] Oct 01 11:46:21 crc kubenswrapper[4669]: E1001 11:46:21.365049 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c85d289-ff7f-4b57-a54a-cb272dec58e2" containerName="glance-db-sync" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.365201 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c85d289-ff7f-4b57-a54a-cb272dec58e2" containerName="glance-db-sync" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.365476 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c85d289-ff7f-4b57-a54a-cb272dec58e2" containerName="glance-db-sync" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.366966 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.401160 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-45rqt"] Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.511396 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-config\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.511461 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf4dc\" (UniqueName: \"kubernetes.io/projected/fccc8879-5a26-4644-8cb0-783fe9816cbd-kube-api-access-sf4dc\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.511534 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.511591 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.511777 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.511821 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.613982 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.614122 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.614170 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.614198 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.614269 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-config\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.614296 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf4dc\" (UniqueName: \"kubernetes.io/projected/fccc8879-5a26-4644-8cb0-783fe9816cbd-kube-api-access-sf4dc\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.615389 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.615438 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.615464 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-svc\") pod 
\"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.615598 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-config\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.616126 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.648173 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf4dc\" (UniqueName: \"kubernetes.io/projected/fccc8879-5a26-4644-8cb0-783fe9816cbd-kube-api-access-sf4dc\") pod \"dnsmasq-dns-8b5c85b87-45rqt\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:21 crc kubenswrapper[4669]: I1001 11:46:21.689479 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.220221 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-45rqt"] Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.240517 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.245380 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.254038 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.254163 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.254317 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wkq9x" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.259963 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.347799 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.347868 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-logs\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.347902 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wmv8\" (UniqueName: \"kubernetes.io/projected/deb3368d-5a59-43f5-93df-5c4c45d00de2-kube-api-access-2wmv8\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc 
kubenswrapper[4669]: I1001 11:46:22.348055 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.348270 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.348307 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-config-data\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.348345 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-scripts\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.449550 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wmv8\" (UniqueName: \"kubernetes.io/projected/deb3368d-5a59-43f5-93df-5c4c45d00de2-kube-api-access-2wmv8\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc 
kubenswrapper[4669]: I1001 11:46:22.449634 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.449687 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.449708 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-config-data\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.449735 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-scripts\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.449785 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.449819 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-logs\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.450323 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-logs\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.450612 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.450807 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.464637 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-config-data\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.467509 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.469936 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-scripts\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.471408 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wmv8\" (UniqueName: \"kubernetes.io/projected/deb3368d-5a59-43f5-93df-5c4c45d00de2-kube-api-access-2wmv8\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.515058 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.518614 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.523866 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.528492 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.540686 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.648475 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.653699 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.653750 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-logs\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.653812 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rbvz\" (UniqueName: 
\"kubernetes.io/projected/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-kube-api-access-5rbvz\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.653837 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.653867 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.653909 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.654196 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.756826 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.756916 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.757010 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.757098 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-logs\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.757198 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rbvz\" (UniqueName: \"kubernetes.io/projected/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-kube-api-access-5rbvz\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.757237 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.757310 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.758781 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.761625 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.769469 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.771282 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-logs\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " 
pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.771937 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.782934 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.788930 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rbvz\" (UniqueName: \"kubernetes.io/projected/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-kube-api-access-5rbvz\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.799089 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.848597 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.893654 4669 generic.go:334] "Generic (PLEG): container finished" podID="fccc8879-5a26-4644-8cb0-783fe9816cbd" containerID="57b41adf8e2ceacb3b887ba4c80bec7fdf92b3caf8b903c8c10bf4f803f44123" exitCode=0 Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.894089 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" event={"ID":"fccc8879-5a26-4644-8cb0-783fe9816cbd","Type":"ContainerDied","Data":"57b41adf8e2ceacb3b887ba4c80bec7fdf92b3caf8b903c8c10bf4f803f44123"} Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.894194 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" event={"ID":"fccc8879-5a26-4644-8cb0-783fe9816cbd","Type":"ContainerStarted","Data":"802f4c2ee0f67f99a78776bd21eee8bfe117a908f2ffe70ba76b697af135848f"} Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.901441 4669 generic.go:334] "Generic (PLEG): container finished" podID="ee3395f1-0549-4bc4-a145-42ff20c37da6" containerID="4165672b7ed2689527620aa0e5fe0c14b451ea5de820cb08dc80918417dcde21" exitCode=0 Oct 01 11:46:22 crc kubenswrapper[4669]: I1001 11:46:22.901494 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8r7vt" event={"ID":"ee3395f1-0549-4bc4-a145-42ff20c37da6","Type":"ContainerDied","Data":"4165672b7ed2689527620aa0e5fe0c14b451ea5de820cb08dc80918417dcde21"} Oct 01 11:46:23 crc kubenswrapper[4669]: I1001 11:46:23.280753 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:46:23 crc kubenswrapper[4669]: I1001 11:46:23.445830 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:46:23 crc kubenswrapper[4669]: I1001 11:46:23.920472 4669 generic.go:334] "Generic (PLEG): container finished" 
podID="4814501d-3b55-40bb-b932-41f91ca1d7fb" containerID="63f3bc3c7a3e9ae9344dc85fbe8e3df0db83cb54290f6006f17fe64976a0c746" exitCode=0 Oct 01 11:46:23 crc kubenswrapper[4669]: I1001 11:46:23.920570 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2cbqn" event={"ID":"4814501d-3b55-40bb-b932-41f91ca1d7fb","Type":"ContainerDied","Data":"63f3bc3c7a3e9ae9344dc85fbe8e3df0db83cb54290f6006f17fe64976a0c746"} Oct 01 11:46:23 crc kubenswrapper[4669]: I1001 11:46:23.928576 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" event={"ID":"fccc8879-5a26-4644-8cb0-783fe9816cbd","Type":"ContainerStarted","Data":"28370ffbeed8916aefd49454aa508a57a8c4cb3ac5e9233234d654e821ae0130"} Oct 01 11:46:23 crc kubenswrapper[4669]: I1001 11:46:23.928629 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:23 crc kubenswrapper[4669]: I1001 11:46:23.965840 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" podStartSLOduration=2.96581383 podStartE2EDuration="2.96581383s" podCreationTimestamp="2025-10-01 11:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:23.965126894 +0000 UTC m=+1075.064691881" watchObservedRunningTime="2025-10-01 11:46:23.96581383 +0000 UTC m=+1075.065378807" Oct 01 11:46:24 crc kubenswrapper[4669]: I1001 11:46:24.279746 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:46:24 crc kubenswrapper[4669]: I1001 11:46:24.348439 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:46:26 crc kubenswrapper[4669]: I1001 11:46:26.366932 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:46:26 crc kubenswrapper[4669]: I1001 11:46:26.371402 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.013233 4669 generic.go:334] "Generic (PLEG): container finished" podID="15da5802-a63f-44a7-b5b2-9f85b62e6675" containerID="0ae6a7126e7381ab1718ccd6b3e2637a6ce2cd81c70734a872755d4a57a58bd9" exitCode=0 Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.013325 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nfxsr" event={"ID":"15da5802-a63f-44a7-b5b2-9f85b62e6675","Type":"ContainerDied","Data":"0ae6a7126e7381ab1718ccd6b3e2637a6ce2cd81c70734a872755d4a57a58bd9"} Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.016384 4669 generic.go:334] "Generic (PLEG): container finished" podID="007c4768-f1c7-4750-a403-9a930798b8fb" containerID="77e8fd6f30440d5c23e630e598d5127008ce249f884e507c028aad52ab19fb53" exitCode=137 Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.016419 4669 generic.go:334] "Generic (PLEG): container finished" podID="007c4768-f1c7-4750-a403-9a930798b8fb" containerID="878df99570804bbb4ca83d6b364f11a1580d55b0999fc8235a0bdc70ec78ab35" exitCode=137 Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.016473 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7697d5fb49-4zsxr" event={"ID":"007c4768-f1c7-4750-a403-9a930798b8fb","Type":"ContainerDied","Data":"77e8fd6f30440d5c23e630e598d5127008ce249f884e507c028aad52ab19fb53"} Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.016504 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7697d5fb49-4zsxr" event={"ID":"007c4768-f1c7-4750-a403-9a930798b8fb","Type":"ContainerDied","Data":"878df99570804bbb4ca83d6b364f11a1580d55b0999fc8235a0bdc70ec78ab35"} Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.019178 4669 generic.go:334] 
"Generic (PLEG): container finished" podID="4663989c-0e40-4edc-a036-87db51b6dd1f" containerID="f50645d7f8a1075223cdc0cd5c60f3fb85a492b6dcf3c7166e5ef9531213c34c" exitCode=137 Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.019223 4669 generic.go:334] "Generic (PLEG): container finished" podID="4663989c-0e40-4edc-a036-87db51b6dd1f" containerID="428d88cb406c771e4b1e4d9dcd129cde4c277cf86afae6378484d7c0dac94377" exitCode=137 Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.019246 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55c957f569-jgz5q" event={"ID":"4663989c-0e40-4edc-a036-87db51b6dd1f","Type":"ContainerDied","Data":"f50645d7f8a1075223cdc0cd5c60f3fb85a492b6dcf3c7166e5ef9531213c34c"} Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.019275 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55c957f569-jgz5q" event={"ID":"4663989c-0e40-4edc-a036-87db51b6dd1f","Type":"ContainerDied","Data":"428d88cb406c771e4b1e4d9dcd129cde4c277cf86afae6378484d7c0dac94377"} Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.022209 4669 generic.go:334] "Generic (PLEG): container finished" podID="56bc6065-f53f-4531-b18b-d7cab77a717b" containerID="d26c4fb93d9265472f91f0f34dad72565a4d975b3d9bd64355d7c695d7486d3d" exitCode=137 Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.022234 4669 generic.go:334] "Generic (PLEG): container finished" podID="56bc6065-f53f-4531-b18b-d7cab77a717b" containerID="7589ef84f4911b8fd022050a72b62ec491e1d2d9147045a7f2a7a5b07a4c7a21" exitCode=137 Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.022263 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f47dd5fdf-8bs76" event={"ID":"56bc6065-f53f-4531-b18b-d7cab77a717b","Type":"ContainerDied","Data":"d26c4fb93d9265472f91f0f34dad72565a4d975b3d9bd64355d7c695d7486d3d"} Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.022300 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-f47dd5fdf-8bs76" event={"ID":"56bc6065-f53f-4531-b18b-d7cab77a717b","Type":"ContainerDied","Data":"7589ef84f4911b8fd022050a72b62ec491e1d2d9147045a7f2a7a5b07a4c7a21"} Oct 01 11:46:27 crc kubenswrapper[4669]: W1001 11:46:27.762931 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb3368d_5a59_43f5_93df_5c4c45d00de2.slice/crio-68ff0909928dcafe9ea63dc6e69106c70bc5ac41389ac2fbb03f9e25131a9e7c WatchSource:0}: Error finding container 68ff0909928dcafe9ea63dc6e69106c70bc5ac41389ac2fbb03f9e25131a9e7c: Status 404 returned error can't find the container with id 68ff0909928dcafe9ea63dc6e69106c70bc5ac41389ac2fbb03f9e25131a9e7c Oct 01 11:46:27 crc kubenswrapper[4669]: W1001 11:46:27.768988 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8d32351_4bba_4ea7_b8f1_69442fe50ac3.slice/crio-c6c122a50db014eb27087cfd18562df832fa9e7cca5138de27e74fed4f6139f2 WatchSource:0}: Error finding container c6c122a50db014eb27087cfd18562df832fa9e7cca5138de27e74fed4f6139f2: Status 404 returned error can't find the container with id c6c122a50db014eb27087cfd18562df832fa9e7cca5138de27e74fed4f6139f2 Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.875453 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:46:27 crc kubenswrapper[4669]: I1001 11:46:27.878312 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8r7vt" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.002692 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txz9v\" (UniqueName: \"kubernetes.io/projected/4814501d-3b55-40bb-b932-41f91ca1d7fb-kube-api-access-txz9v\") pod \"4814501d-3b55-40bb-b932-41f91ca1d7fb\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.002870 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-config-data\") pod \"ee3395f1-0549-4bc4-a145-42ff20c37da6\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.002936 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cflqb\" (UniqueName: \"kubernetes.io/projected/ee3395f1-0549-4bc4-a145-42ff20c37da6-kube-api-access-cflqb\") pod \"ee3395f1-0549-4bc4-a145-42ff20c37da6\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.003012 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-combined-ca-bundle\") pod \"4814501d-3b55-40bb-b932-41f91ca1d7fb\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.003055 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-combined-ca-bundle\") pod \"ee3395f1-0549-4bc4-a145-42ff20c37da6\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.003097 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-scripts\") pod \"ee3395f1-0549-4bc4-a145-42ff20c37da6\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.003137 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-db-sync-config-data\") pod \"4814501d-3b55-40bb-b932-41f91ca1d7fb\" (UID: \"4814501d-3b55-40bb-b932-41f91ca1d7fb\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.003187 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee3395f1-0549-4bc4-a145-42ff20c37da6-logs\") pod \"ee3395f1-0549-4bc4-a145-42ff20c37da6\" (UID: \"ee3395f1-0549-4bc4-a145-42ff20c37da6\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.005044 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee3395f1-0549-4bc4-a145-42ff20c37da6-logs" (OuterVolumeSpecName: "logs") pod "ee3395f1-0549-4bc4-a145-42ff20c37da6" (UID: "ee3395f1-0549-4bc4-a145-42ff20c37da6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.016801 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-scripts" (OuterVolumeSpecName: "scripts") pod "ee3395f1-0549-4bc4-a145-42ff20c37da6" (UID: "ee3395f1-0549-4bc4-a145-42ff20c37da6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.019621 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4814501d-3b55-40bb-b932-41f91ca1d7fb-kube-api-access-txz9v" (OuterVolumeSpecName: "kube-api-access-txz9v") pod "4814501d-3b55-40bb-b932-41f91ca1d7fb" (UID: "4814501d-3b55-40bb-b932-41f91ca1d7fb"). InnerVolumeSpecName "kube-api-access-txz9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.020569 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4814501d-3b55-40bb-b932-41f91ca1d7fb" (UID: "4814501d-3b55-40bb-b932-41f91ca1d7fb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.024360 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3395f1-0549-4bc4-a145-42ff20c37da6-kube-api-access-cflqb" (OuterVolumeSpecName: "kube-api-access-cflqb") pod "ee3395f1-0549-4bc4-a145-42ff20c37da6" (UID: "ee3395f1-0549-4bc4-a145-42ff20c37da6"). InnerVolumeSpecName "kube-api-access-cflqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.033323 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"deb3368d-5a59-43f5-93df-5c4c45d00de2","Type":"ContainerStarted","Data":"68ff0909928dcafe9ea63dc6e69106c70bc5ac41389ac2fbb03f9e25131a9e7c"} Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.035628 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2cbqn" event={"ID":"4814501d-3b55-40bb-b932-41f91ca1d7fb","Type":"ContainerDied","Data":"00fbcfa86f2762c5fea4a5c1d0dc5fd6cf3611403d5d83d3d60cdf1908e465cb"} Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.035669 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00fbcfa86f2762c5fea4a5c1d0dc5fd6cf3611403d5d83d3d60cdf1908e465cb" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.035745 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2cbqn" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.049861 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee3395f1-0549-4bc4-a145-42ff20c37da6" (UID: "ee3395f1-0549-4bc4-a145-42ff20c37da6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.051693 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4814501d-3b55-40bb-b932-41f91ca1d7fb" (UID: "4814501d-3b55-40bb-b932-41f91ca1d7fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.065102 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8r7vt" event={"ID":"ee3395f1-0549-4bc4-a145-42ff20c37da6","Type":"ContainerDied","Data":"ddc34707d1c53d697793fb461c7cb3cda50233e9c42940eaca087dce4f957b9e"} Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.065178 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddc34707d1c53d697793fb461c7cb3cda50233e9c42940eaca087dce4f957b9e" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.065275 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8r7vt" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.069656 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8d32351-4bba-4ea7-b8f1-69442fe50ac3","Type":"ContainerStarted","Data":"c6c122a50db014eb27087cfd18562df832fa9e7cca5138de27e74fed4f6139f2"} Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.105175 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.105215 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.105227 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.105237 4669 reconciler_common.go:293] "Volume detached for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4814501d-3b55-40bb-b932-41f91ca1d7fb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.105247 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee3395f1-0549-4bc4-a145-42ff20c37da6-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.105256 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txz9v\" (UniqueName: \"kubernetes.io/projected/4814501d-3b55-40bb-b932-41f91ca1d7fb-kube-api-access-txz9v\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.105268 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cflqb\" (UniqueName: \"kubernetes.io/projected/ee3395f1-0549-4bc4-a145-42ff20c37da6-kube-api-access-cflqb\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.142304 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-config-data" (OuterVolumeSpecName: "config-data") pod "ee3395f1-0549-4bc4-a145-42ff20c37da6" (UID: "ee3395f1-0549-4bc4-a145-42ff20c37da6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.158801 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.198781 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-74d4dc5744-kqwsh" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.237906 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3395f1-0549-4bc4-a145-42ff20c37da6-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.338848 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-866c85f5d8-mvd64"] Oct 01 11:46:28 crc kubenswrapper[4669]: E1001 11:46:28.536253 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4814501d_3b55_40bb_b932_41f91ca1d7fb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee3395f1_0549_4bc4_a145_42ff20c37da6.slice\": RecentStats: unable to find data in memory cache]" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.875616 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.888728 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.898848 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.901894 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.952617 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m8cz\" (UniqueName: \"kubernetes.io/projected/56bc6065-f53f-4531-b18b-d7cab77a717b-kube-api-access-5m8cz\") pod \"56bc6065-f53f-4531-b18b-d7cab77a717b\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.952732 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007c4768-f1c7-4750-a403-9a930798b8fb-logs\") pod \"007c4768-f1c7-4750-a403-9a930798b8fb\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.952822 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/007c4768-f1c7-4750-a403-9a930798b8fb-horizon-secret-key\") pod \"007c4768-f1c7-4750-a403-9a930798b8fb\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.952850 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-combined-ca-bundle\") pod \"15da5802-a63f-44a7-b5b2-9f85b62e6675\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.952925 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-scripts\") pod \"007c4768-f1c7-4750-a403-9a930798b8fb\" (UID: 
\"007c4768-f1c7-4750-a403-9a930798b8fb\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.952957 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-config-data\") pod \"56bc6065-f53f-4531-b18b-d7cab77a717b\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.952980 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-scripts\") pod \"56bc6065-f53f-4531-b18b-d7cab77a717b\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.953034 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-config\") pod \"15da5802-a63f-44a7-b5b2-9f85b62e6675\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.953056 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q527\" (UniqueName: \"kubernetes.io/projected/4663989c-0e40-4edc-a036-87db51b6dd1f-kube-api-access-9q527\") pod \"4663989c-0e40-4edc-a036-87db51b6dd1f\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.953101 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-config-data\") pod \"007c4768-f1c7-4750-a403-9a930798b8fb\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.953135 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-config-data\") pod \"4663989c-0e40-4edc-a036-87db51b6dd1f\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.953175 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtpqt\" (UniqueName: \"kubernetes.io/projected/15da5802-a63f-44a7-b5b2-9f85b62e6675-kube-api-access-gtpqt\") pod \"15da5802-a63f-44a7-b5b2-9f85b62e6675\" (UID: \"15da5802-a63f-44a7-b5b2-9f85b62e6675\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.953200 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56bc6065-f53f-4531-b18b-d7cab77a717b-horizon-secret-key\") pod \"56bc6065-f53f-4531-b18b-d7cab77a717b\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.953242 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4663989c-0e40-4edc-a036-87db51b6dd1f-logs\") pod \"4663989c-0e40-4edc-a036-87db51b6dd1f\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.953304 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56bc6065-f53f-4531-b18b-d7cab77a717b-logs\") pod \"56bc6065-f53f-4531-b18b-d7cab77a717b\" (UID: \"56bc6065-f53f-4531-b18b-d7cab77a717b\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.953364 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4663989c-0e40-4edc-a036-87db51b6dd1f-horizon-secret-key\") pod \"4663989c-0e40-4edc-a036-87db51b6dd1f\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.953395 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nbzf\" (UniqueName: \"kubernetes.io/projected/007c4768-f1c7-4750-a403-9a930798b8fb-kube-api-access-8nbzf\") pod \"007c4768-f1c7-4750-a403-9a930798b8fb\" (UID: \"007c4768-f1c7-4750-a403-9a930798b8fb\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.953417 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-scripts\") pod \"4663989c-0e40-4edc-a036-87db51b6dd1f\" (UID: \"4663989c-0e40-4edc-a036-87db51b6dd1f\") " Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.954774 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/007c4768-f1c7-4750-a403-9a930798b8fb-logs" (OuterVolumeSpecName: "logs") pod "007c4768-f1c7-4750-a403-9a930798b8fb" (UID: "007c4768-f1c7-4750-a403-9a930798b8fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.956043 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4663989c-0e40-4edc-a036-87db51b6dd1f-logs" (OuterVolumeSpecName: "logs") pod "4663989c-0e40-4edc-a036-87db51b6dd1f" (UID: "4663989c-0e40-4edc-a036-87db51b6dd1f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.980622 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56bc6065-f53f-4531-b18b-d7cab77a717b-logs" (OuterVolumeSpecName: "logs") pod "56bc6065-f53f-4531-b18b-d7cab77a717b" (UID: "56bc6065-f53f-4531-b18b-d7cab77a717b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.983873 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4663989c-0e40-4edc-a036-87db51b6dd1f-kube-api-access-9q527" (OuterVolumeSpecName: "kube-api-access-9q527") pod "4663989c-0e40-4edc-a036-87db51b6dd1f" (UID: "4663989c-0e40-4edc-a036-87db51b6dd1f"). InnerVolumeSpecName "kube-api-access-9q527". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.995895 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4663989c-0e40-4edc-a036-87db51b6dd1f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4663989c-0e40-4edc-a036-87db51b6dd1f" (UID: "4663989c-0e40-4edc-a036-87db51b6dd1f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.998238 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56bc6065-f53f-4531-b18b-d7cab77a717b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "56bc6065-f53f-4531-b18b-d7cab77a717b" (UID: "56bc6065-f53f-4531-b18b-d7cab77a717b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.998310 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15da5802-a63f-44a7-b5b2-9f85b62e6675-kube-api-access-gtpqt" (OuterVolumeSpecName: "kube-api-access-gtpqt") pod "15da5802-a63f-44a7-b5b2-9f85b62e6675" (UID: "15da5802-a63f-44a7-b5b2-9f85b62e6675"). InnerVolumeSpecName "kube-api-access-gtpqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:28 crc kubenswrapper[4669]: I1001 11:46:28.998392 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56bc6065-f53f-4531-b18b-d7cab77a717b-kube-api-access-5m8cz" (OuterVolumeSpecName: "kube-api-access-5m8cz") pod "56bc6065-f53f-4531-b18b-d7cab77a717b" (UID: "56bc6065-f53f-4531-b18b-d7cab77a717b"). InnerVolumeSpecName "kube-api-access-5m8cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.001278 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007c4768-f1c7-4750-a403-9a930798b8fb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "007c4768-f1c7-4750-a403-9a930798b8fb" (UID: "007c4768-f1c7-4750-a403-9a930798b8fb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.010441 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007c4768-f1c7-4750-a403-9a930798b8fb-kube-api-access-8nbzf" (OuterVolumeSpecName: "kube-api-access-8nbzf") pod "007c4768-f1c7-4750-a403-9a930798b8fb" (UID: "007c4768-f1c7-4750-a403-9a930798b8fb"). InnerVolumeSpecName "kube-api-access-8nbzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.057945 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtpqt\" (UniqueName: \"kubernetes.io/projected/15da5802-a63f-44a7-b5b2-9f85b62e6675-kube-api-access-gtpqt\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.057981 4669 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56bc6065-f53f-4531-b18b-d7cab77a717b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.057991 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4663989c-0e40-4edc-a036-87db51b6dd1f-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.058000 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56bc6065-f53f-4531-b18b-d7cab77a717b-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.058009 4669 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4663989c-0e40-4edc-a036-87db51b6dd1f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.058017 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nbzf\" (UniqueName: \"kubernetes.io/projected/007c4768-f1c7-4750-a403-9a930798b8fb-kube-api-access-8nbzf\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.058026 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m8cz\" (UniqueName: \"kubernetes.io/projected/56bc6065-f53f-4531-b18b-d7cab77a717b-kube-api-access-5m8cz\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 
11:46:29.058039 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/007c4768-f1c7-4750-a403-9a930798b8fb-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.058048 4669 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/007c4768-f1c7-4750-a403-9a930798b8fb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.058056 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q527\" (UniqueName: \"kubernetes.io/projected/4663989c-0e40-4edc-a036-87db51b6dd1f-kube-api-access-9q527\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.066407 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-scripts" (OuterVolumeSpecName: "scripts") pod "007c4768-f1c7-4750-a403-9a930798b8fb" (UID: "007c4768-f1c7-4750-a403-9a930798b8fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.067350 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-config-data" (OuterVolumeSpecName: "config-data") pod "007c4768-f1c7-4750-a403-9a930798b8fb" (UID: "007c4768-f1c7-4750-a403-9a930798b8fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.083828 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-scripts" (OuterVolumeSpecName: "scripts") pod "56bc6065-f53f-4531-b18b-d7cab77a717b" (UID: "56bc6065-f53f-4531-b18b-d7cab77a717b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.088891 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-scripts" (OuterVolumeSpecName: "scripts") pod "4663989c-0e40-4edc-a036-87db51b6dd1f" (UID: "4663989c-0e40-4edc-a036-87db51b6dd1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.093886 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-config-data" (OuterVolumeSpecName: "config-data") pod "4663989c-0e40-4edc-a036-87db51b6dd1f" (UID: "4663989c-0e40-4edc-a036-87db51b6dd1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.101740 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15da5802-a63f-44a7-b5b2-9f85b62e6675" (UID: "15da5802-a63f-44a7-b5b2-9f85b62e6675"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.116798 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55c957f569-jgz5q" event={"ID":"4663989c-0e40-4edc-a036-87db51b6dd1f","Type":"ContainerDied","Data":"63c1bd1e4be46c0062ec1dace4a00584c329ed598adf9c4a3f213a656dd43317"} Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.116872 4669 scope.go:117] "RemoveContainer" containerID="f50645d7f8a1075223cdc0cd5c60f3fb85a492b6dcf3c7166e5ef9531213c34c" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.117048 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55c957f569-jgz5q" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.117335 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-config" (OuterVolumeSpecName: "config") pod "15da5802-a63f-44a7-b5b2-9f85b62e6675" (UID: "15da5802-a63f-44a7-b5b2-9f85b62e6675"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.124701 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f47dd5fdf-8bs76" event={"ID":"56bc6065-f53f-4531-b18b-d7cab77a717b","Type":"ContainerDied","Data":"37f945371eeb8ce1cbc151205fa6f24aebe9c3004c85fa507315b7bad99f874d"} Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.125042 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f47dd5fdf-8bs76" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.163266 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.163303 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.163314 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.163324 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/15da5802-a63f-44a7-b5b2-9f85b62e6675-config\") on node \"crc\" DevicePath \"\"" 
Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.163336 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/007c4768-f1c7-4750-a403-9a930798b8fb-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.163345 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.163353 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4663989c-0e40-4edc-a036-87db51b6dd1f-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.197576 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nfxsr" event={"ID":"15da5802-a63f-44a7-b5b2-9f85b62e6675","Type":"ContainerDied","Data":"7200a839fb92a77877c931470b817cc1470627ed6dcafee8c07f0bb29e2076f9"} Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.197637 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7200a839fb92a77877c931470b817cc1470627ed6dcafee8c07f0bb29e2076f9" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.197751 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-nfxsr" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.225387 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-795f7c5588-ppc46"] Oct 01 11:46:29 crc kubenswrapper[4669]: E1001 11:46:29.226104 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4663989c-0e40-4edc-a036-87db51b6dd1f" containerName="horizon" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226124 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4663989c-0e40-4edc-a036-87db51b6dd1f" containerName="horizon" Oct 01 11:46:29 crc kubenswrapper[4669]: E1001 11:46:29.226135 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15da5802-a63f-44a7-b5b2-9f85b62e6675" containerName="neutron-db-sync" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226143 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="15da5802-a63f-44a7-b5b2-9f85b62e6675" containerName="neutron-db-sync" Oct 01 11:46:29 crc kubenswrapper[4669]: E1001 11:46:29.226161 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007c4768-f1c7-4750-a403-9a930798b8fb" containerName="horizon-log" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226169 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="007c4768-f1c7-4750-a403-9a930798b8fb" containerName="horizon-log" Oct 01 11:46:29 crc kubenswrapper[4669]: E1001 11:46:29.226181 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bc6065-f53f-4531-b18b-d7cab77a717b" containerName="horizon" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226195 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bc6065-f53f-4531-b18b-d7cab77a717b" containerName="horizon" Oct 01 11:46:29 crc kubenswrapper[4669]: E1001 11:46:29.226212 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bc6065-f53f-4531-b18b-d7cab77a717b" containerName="horizon-log" Oct 01 11:46:29 crc 
kubenswrapper[4669]: I1001 11:46:29.226221 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bc6065-f53f-4531-b18b-d7cab77a717b" containerName="horizon-log" Oct 01 11:46:29 crc kubenswrapper[4669]: E1001 11:46:29.226236 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4814501d-3b55-40bb-b932-41f91ca1d7fb" containerName="barbican-db-sync" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226245 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4814501d-3b55-40bb-b932-41f91ca1d7fb" containerName="barbican-db-sync" Oct 01 11:46:29 crc kubenswrapper[4669]: E1001 11:46:29.226277 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4663989c-0e40-4edc-a036-87db51b6dd1f" containerName="horizon-log" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226284 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4663989c-0e40-4edc-a036-87db51b6dd1f" containerName="horizon-log" Oct 01 11:46:29 crc kubenswrapper[4669]: E1001 11:46:29.226299 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3395f1-0549-4bc4-a145-42ff20c37da6" containerName="placement-db-sync" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226311 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3395f1-0549-4bc4-a145-42ff20c37da6" containerName="placement-db-sync" Oct 01 11:46:29 crc kubenswrapper[4669]: E1001 11:46:29.226344 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007c4768-f1c7-4750-a403-9a930798b8fb" containerName="horizon" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226352 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="007c4768-f1c7-4750-a403-9a930798b8fb" containerName="horizon" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226554 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="4663989c-0e40-4edc-a036-87db51b6dd1f" containerName="horizon-log" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226567 
4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="4663989c-0e40-4edc-a036-87db51b6dd1f" containerName="horizon" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226574 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="4814501d-3b55-40bb-b932-41f91ca1d7fb" containerName="barbican-db-sync" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226587 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="56bc6065-f53f-4531-b18b-d7cab77a717b" containerName="horizon-log" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226602 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3395f1-0549-4bc4-a145-42ff20c37da6" containerName="placement-db-sync" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226615 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="15da5802-a63f-44a7-b5b2-9f85b62e6675" containerName="neutron-db-sync" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226626 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="56bc6065-f53f-4531-b18b-d7cab77a717b" containerName="horizon" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226636 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="007c4768-f1c7-4750-a403-9a930798b8fb" containerName="horizon-log" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.226646 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="007c4768-f1c7-4750-a403-9a930798b8fb" containerName="horizon" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.227756 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.264491 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-866c85f5d8-mvd64" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon-log" containerID="cri-o://6836b5a3a12e44f349fe24052dfc7816b67cc72171ee1b8dc050d26ad2b5f3bc" gracePeriod=30 Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.264935 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-config-data" (OuterVolumeSpecName: "config-data") pod "56bc6065-f53f-4531-b18b-d7cab77a717b" (UID: "56bc6065-f53f-4531-b18b-d7cab77a717b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.264963 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7697d5fb49-4zsxr" event={"ID":"007c4768-f1c7-4750-a403-9a930798b8fb","Type":"ContainerDied","Data":"7b89d2b6f4d8e2f0c39ac1f29dc0982fdc597d3af1881bc789d967d6287bab0f"} Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.265015 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7697d5fb49-4zsxr" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.265062 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-866c85f5d8-mvd64" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon" containerID="cri-o://e291c34c35b2e8ea3b830f371585d97db07df75e845acb5850aa9ed5690727d9" gracePeriod=30 Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.272633 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-public-tls-certs\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.272692 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-internal-tls-certs\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.272719 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thxbw\" (UniqueName: \"kubernetes.io/projected/419df7bd-f554-4888-8a51-e885964ada7e-kube-api-access-thxbw\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.272738 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-scripts\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " 
pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.272805 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-combined-ca-bundle\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.272854 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-config-data\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.272921 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/419df7bd-f554-4888-8a51-e885964ada7e-logs\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.272981 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56bc6065-f53f-4531-b18b-d7cab77a717b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.273010 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.273174 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mpsgs" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.273407 4669 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-placement-public-svc" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.273508 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.276198 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.297161 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-795f7c5588-ppc46"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.341302 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-84b6d46dff-gdp9m"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.359126 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.365533 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.365871 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xj549" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.367050 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.372486 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5b7c87b994-mshrj"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.373797 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.375617 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2f34b06-3e5b-4380-8b38-4c9be553dc00-config-data-custom\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.375697 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/419df7bd-f554-4888-8a51-e885964ada7e-logs\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.375778 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-public-tls-certs\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.375814 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-internal-tls-certs\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.375843 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thxbw\" (UniqueName: \"kubernetes.io/projected/419df7bd-f554-4888-8a51-e885964ada7e-kube-api-access-thxbw\") pod \"placement-795f7c5588-ppc46\" (UID: 
\"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.375875 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-scripts\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.375932 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fbr2\" (UniqueName: \"kubernetes.io/projected/c2f34b06-3e5b-4380-8b38-4c9be553dc00-kube-api-access-2fbr2\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.375966 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-combined-ca-bundle\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.375993 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f34b06-3e5b-4380-8b38-4c9be553dc00-config-data\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.376013 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f34b06-3e5b-4380-8b38-4c9be553dc00-combined-ca-bundle\") pod 
\"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.376049 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-config-data\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.376113 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2f34b06-3e5b-4380-8b38-4c9be553dc00-logs\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.376392 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.376465 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/419df7bd-f554-4888-8a51-e885964ada7e-logs\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.387785 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-combined-ca-bundle\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.390558 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-internal-tls-certs\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.390630 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-public-tls-certs\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.391180 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-84b6d46dff-gdp9m"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.399417 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-config-data\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.405328 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/419df7bd-f554-4888-8a51-e885964ada7e-scripts\") pod \"placement-795f7c5588-ppc46\" (UID: \"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.411645 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b7c87b994-mshrj"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.426797 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thxbw\" (UniqueName: \"kubernetes.io/projected/419df7bd-f554-4888-8a51-e885964ada7e-kube-api-access-thxbw\") pod \"placement-795f7c5588-ppc46\" (UID: 
\"419df7bd-f554-4888-8a51-e885964ada7e\") " pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.436066 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-45rqt"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.436387 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" podUID="fccc8879-5a26-4644-8cb0-783fe9816cbd" containerName="dnsmasq-dns" containerID="cri-o://28370ffbeed8916aefd49454aa508a57a8c4cb3ac5e9233234d654e821ae0130" gracePeriod=10 Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.447915 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.453274 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55c957f569-jgz5q"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.461511 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55c957f569-jgz5q"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.488651 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rxmmm"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.503966 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.529822 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2f34b06-3e5b-4380-8b38-4c9be553dc00-logs\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.529935 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2f34b06-3e5b-4380-8b38-4c9be553dc00-config-data-custom\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.530444 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fbr2\" (UniqueName: \"kubernetes.io/projected/c2f34b06-3e5b-4380-8b38-4c9be553dc00-kube-api-access-2fbr2\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.530592 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f34b06-3e5b-4380-8b38-4c9be553dc00-config-data\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.530672 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f34b06-3e5b-4380-8b38-4c9be553dc00-combined-ca-bundle\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") 
" pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.532992 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2f34b06-3e5b-4380-8b38-4c9be553dc00-logs\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.543009 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f34b06-3e5b-4380-8b38-4c9be553dc00-combined-ca-bundle\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.582695 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f34b06-3e5b-4380-8b38-4c9be553dc00-config-data\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.606703 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fbr2\" (UniqueName: \"kubernetes.io/projected/c2f34b06-3e5b-4380-8b38-4c9be553dc00-kube-api-access-2fbr2\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.663976 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14df8713-8fa5-482c-9280-af169783618d-config-data-custom\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " 
pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.687010 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mpsgs" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.687857 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.730714 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14df8713-8fa5-482c-9280-af169783618d-config-data\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.730811 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.730883 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.731104 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: 
\"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.731183 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14df8713-8fa5-482c-9280-af169783618d-combined-ca-bundle\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.731259 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnlhk\" (UniqueName: \"kubernetes.io/projected/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-kube-api-access-jnlhk\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.731303 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9vg4\" (UniqueName: \"kubernetes.io/projected/14df8713-8fa5-482c-9280-af169783618d-kube-api-access-v9vg4\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.731442 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-config\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.731529 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/14df8713-8fa5-482c-9280-af169783618d-logs\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.731595 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.754000 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2f34b06-3e5b-4380-8b38-4c9be553dc00-config-data-custom\") pod \"barbican-worker-84b6d46dff-gdp9m\" (UID: \"c2f34b06-3e5b-4380-8b38-4c9be553dc00\") " pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.867280 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xj549" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.870525 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-config\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.870587 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14df8713-8fa5-482c-9280-af169783618d-logs\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 
01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.870611 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-84b6d46dff-gdp9m" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.871393 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.871920 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-config\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.870620 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.872802 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14df8713-8fa5-482c-9280-af169783618d-config-data-custom\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.872884 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/14df8713-8fa5-482c-9280-af169783618d-config-data\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.872941 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.872962 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.873026 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.873060 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14df8713-8fa5-482c-9280-af169783618d-combined-ca-bundle\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.873268 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnlhk\" (UniqueName: 
\"kubernetes.io/projected/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-kube-api-access-jnlhk\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.873290 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9vg4\" (UniqueName: \"kubernetes.io/projected/14df8713-8fa5-482c-9280-af169783618d-kube-api-access-v9vg4\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.874452 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.874991 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.876245 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.881541 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 01 11:46:29 crc 
kubenswrapper[4669]: I1001 11:46:29.888485 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14df8713-8fa5-482c-9280-af169783618d-logs\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.910010 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14df8713-8fa5-482c-9280-af169783618d-config-data-custom\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.912402 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14df8713-8fa5-482c-9280-af169783618d-config-data\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.913819 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14df8713-8fa5-482c-9280-af169783618d-combined-ca-bundle\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.917236 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnlhk\" (UniqueName: \"kubernetes.io/projected/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-kube-api-access-jnlhk\") pod \"dnsmasq-dns-75c8ddd69c-rxmmm\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" 
Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.917548 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9vg4\" (UniqueName: \"kubernetes.io/projected/14df8713-8fa5-482c-9280-af169783618d-kube-api-access-v9vg4\") pod \"barbican-keystone-listener-5b7c87b994-mshrj\" (UID: \"14df8713-8fa5-482c-9280-af169783618d\") " pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.941455 4669 scope.go:117] "RemoveContainer" containerID="428d88cb406c771e4b1e4d9dcd129cde4c277cf86afae6378484d7c0dac94377" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.943970 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4663989c-0e40-4edc-a036-87db51b6dd1f" path="/var/lib/kubelet/pods/4663989c-0e40-4edc-a036-87db51b6dd1f/volumes" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.944749 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rxmmm"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.944789 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6fbb698fb8-vwrw5"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.946295 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5fbb868f5d-l9gnp"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.947438 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.949217 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fbb698fb8-vwrw5"] Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.949689 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.952386 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-fwtz2" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.953285 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.953870 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.954269 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.955926 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.976107 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.976170 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-combined-ca-bundle\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.976250 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-httpd-config\") pod 
\"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.976274 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-ovndb-tls-certs\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.976317 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b949b8-5f5e-4f46-836c-7be0991a67d9-logs\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.976339 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-config\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.976374 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfbv5\" (UniqueName: \"kubernetes.io/projected/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-kube-api-access-tfbv5\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.976397 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data-custom\") pod 
\"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.976417 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-combined-ca-bundle\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.976450 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2qbd\" (UniqueName: \"kubernetes.io/projected/e4b949b8-5f5e-4f46-836c-7be0991a67d9-kube-api-access-h2qbd\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:29 crc kubenswrapper[4669]: I1001 11:46:29.987864 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fbb868f5d-l9gnp"] Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.005301 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7697d5fb49-4zsxr"] Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.020247 4669 scope.go:117] "RemoveContainer" containerID="d26c4fb93d9265472f91f0f34dad72565a4d975b3d9bd64355d7c695d7486d3d" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.025871 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7697d5fb49-4zsxr"] Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.047654 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f47dd5fdf-8bs76"] Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.056129 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f47dd5fdf-8bs76"] Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 
11:46:30.077829 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-httpd-config\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.077898 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-ovndb-tls-certs\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.078089 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b949b8-5f5e-4f46-836c-7be0991a67d9-logs\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.078120 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-config\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.078156 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfbv5\" (UniqueName: \"kubernetes.io/projected/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-kube-api-access-tfbv5\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.078177 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data-custom\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.078197 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-combined-ca-bundle\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.078229 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2qbd\" (UniqueName: \"kubernetes.io/projected/e4b949b8-5f5e-4f46-836c-7be0991a67d9-kube-api-access-h2qbd\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.078256 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.078283 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-combined-ca-bundle\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.083214 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e4b949b8-5f5e-4f46-836c-7be0991a67d9-logs\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.086485 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-combined-ca-bundle\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.088220 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-ovndb-tls-certs\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.092885 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-combined-ca-bundle\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.098052 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.099892 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data-custom\") pod 
\"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.100393 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-config\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.100732 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-httpd-config\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.101018 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfbv5\" (UniqueName: \"kubernetes.io/projected/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-kube-api-access-tfbv5\") pod \"neutron-6fbb698fb8-vwrw5\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.126723 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2qbd\" (UniqueName: \"kubernetes.io/projected/e4b949b8-5f5e-4f46-836c-7be0991a67d9-kube-api-access-h2qbd\") pod \"barbican-api-5fbb868f5d-l9gnp\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.167276 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.176118 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.252359 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.302720 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.324601 4669 generic.go:334] "Generic (PLEG): container finished" podID="fccc8879-5a26-4644-8cb0-783fe9816cbd" containerID="28370ffbeed8916aefd49454aa508a57a8c4cb3ac5e9233234d654e821ae0130" exitCode=0 Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.324686 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" event={"ID":"fccc8879-5a26-4644-8cb0-783fe9816cbd","Type":"ContainerDied","Data":"28370ffbeed8916aefd49454aa508a57a8c4cb3ac5e9233234d654e821ae0130"} Oct 01 11:46:30 crc kubenswrapper[4669]: E1001 11:46:30.392948 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.542842 4669 scope.go:117] "RemoveContainer" containerID="7589ef84f4911b8fd022050a72b62ec491e1d2d9147045a7f2a7a5b07a4c7a21" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.567437 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.641676 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-795f7c5588-ppc46"] Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.654421 4669 scope.go:117] "RemoveContainer" containerID="77e8fd6f30440d5c23e630e598d5127008ce249f884e507c028aad52ab19fb53" Oct 01 11:46:30 crc kubenswrapper[4669]: W1001 11:46:30.668205 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod419df7bd_f554_4888_8a51_e885964ada7e.slice/crio-fa9819ac8d738f065dfeaaa7ed8a7489a6efe6fa6f09e610f96cdf416f1fb005 WatchSource:0}: Error finding container fa9819ac8d738f065dfeaaa7ed8a7489a6efe6fa6f09e610f96cdf416f1fb005: Status 404 returned error can't find the container with id fa9819ac8d738f065dfeaaa7ed8a7489a6efe6fa6f09e610f96cdf416f1fb005 Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.693197 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-nb\") pod \"fccc8879-5a26-4644-8cb0-783fe9816cbd\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.693871 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-svc\") pod \"fccc8879-5a26-4644-8cb0-783fe9816cbd\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.694139 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-config\") pod \"fccc8879-5a26-4644-8cb0-783fe9816cbd\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " Oct 01 11:46:30 
crc kubenswrapper[4669]: I1001 11:46:30.694243 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf4dc\" (UniqueName: \"kubernetes.io/projected/fccc8879-5a26-4644-8cb0-783fe9816cbd-kube-api-access-sf4dc\") pod \"fccc8879-5a26-4644-8cb0-783fe9816cbd\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.694262 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-sb\") pod \"fccc8879-5a26-4644-8cb0-783fe9816cbd\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.694293 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-swift-storage-0\") pod \"fccc8879-5a26-4644-8cb0-783fe9816cbd\" (UID: \"fccc8879-5a26-4644-8cb0-783fe9816cbd\") " Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.708529 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fccc8879-5a26-4644-8cb0-783fe9816cbd-kube-api-access-sf4dc" (OuterVolumeSpecName: "kube-api-access-sf4dc") pod "fccc8879-5a26-4644-8cb0-783fe9816cbd" (UID: "fccc8879-5a26-4644-8cb0-783fe9816cbd"). InnerVolumeSpecName "kube-api-access-sf4dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.779887 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fccc8879-5a26-4644-8cb0-783fe9816cbd" (UID: "fccc8879-5a26-4644-8cb0-783fe9816cbd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.781427 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fccc8879-5a26-4644-8cb0-783fe9816cbd" (UID: "fccc8879-5a26-4644-8cb0-783fe9816cbd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.785231 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b7c87b994-mshrj"] Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.795749 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-84b6d46dff-gdp9m"] Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.797746 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf4dc\" (UniqueName: \"kubernetes.io/projected/fccc8879-5a26-4644-8cb0-783fe9816cbd-kube-api-access-sf4dc\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.797802 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.797820 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.810575 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fccc8879-5a26-4644-8cb0-783fe9816cbd" (UID: 
"fccc8879-5a26-4644-8cb0-783fe9816cbd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.838486 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-config" (OuterVolumeSpecName: "config") pod "fccc8879-5a26-4644-8cb0-783fe9816cbd" (UID: "fccc8879-5a26-4644-8cb0-783fe9816cbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.841591 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fccc8879-5a26-4644-8cb0-783fe9816cbd" (UID: "fccc8879-5a26-4644-8cb0-783fe9816cbd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.908805 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.908844 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.908857 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fccc8879-5a26-4644-8cb0-783fe9816cbd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:30 crc kubenswrapper[4669]: I1001 11:46:30.928029 4669 scope.go:117] "RemoveContainer" containerID="878df99570804bbb4ca83d6b364f11a1580d55b0999fc8235a0bdc70ec78ab35" Oct 01 11:46:30 crc kubenswrapper[4669]: W1001 11:46:30.937162 4669 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2f34b06_3e5b_4380_8b38_4c9be553dc00.slice/crio-ff3ce2ea1f28bd7412c699fbbd378837f191063bccc465cb4add1ceb007d615e WatchSource:0}: Error finding container ff3ce2ea1f28bd7412c699fbbd378837f191063bccc465cb4add1ceb007d615e: Status 404 returned error can't find the container with id ff3ce2ea1f28bd7412c699fbbd378837f191063bccc465cb4add1ceb007d615e Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.119791 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rxmmm"] Oct 01 11:46:31 crc kubenswrapper[4669]: W1001 11:46:31.126566 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode830daa9_ce44_48e3_8a0e_61a51a58b2b2.slice/crio-bd4223d7730a3ed11b849bc9f50b65250e0309027e2e72c4d9db3c2ae7410b7f WatchSource:0}: Error finding container bd4223d7730a3ed11b849bc9f50b65250e0309027e2e72c4d9db3c2ae7410b7f: Status 404 returned error can't find the container with id bd4223d7730a3ed11b849bc9f50b65250e0309027e2e72c4d9db3c2ae7410b7f Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.333304 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fbb868f5d-l9gnp"] Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.373426 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fbb698fb8-vwrw5"] Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.386812 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8d32351-4bba-4ea7-b8f1-69442fe50ac3","Type":"ContainerStarted","Data":"e73cfd3b900b9c3d2c5c3300cc74cbb9d6530fb00c558b80d22b1835b56d0c45"} Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.397965 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" 
event={"ID":"14df8713-8fa5-482c-9280-af169783618d","Type":"ContainerStarted","Data":"87494806651e256b46f5cca0ea953556f826b32a69881c89cf80467e78af178e"} Oct 01 11:46:31 crc kubenswrapper[4669]: W1001 11:46:31.410525 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda701bc8f_fd3f_43ed_9b7a_bbb3696dc598.slice/crio-f234e6191799de49d8c81fcd839a29f3d69eedbbe274c05da476b574e5fdcda3 WatchSource:0}: Error finding container f234e6191799de49d8c81fcd839a29f3d69eedbbe274c05da476b574e5fdcda3: Status 404 returned error can't find the container with id f234e6191799de49d8c81fcd839a29f3d69eedbbe274c05da476b574e5fdcda3 Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.444185 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-795f7c5588-ppc46" event={"ID":"419df7bd-f554-4888-8a51-e885964ada7e","Type":"ContainerStarted","Data":"fa9819ac8d738f065dfeaaa7ed8a7489a6efe6fa6f09e610f96cdf416f1fb005"} Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.455123 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1ba96b9-d556-419e-a8a3-f90348499977","Type":"ContainerStarted","Data":"7a42f86760e35df717ae00f2122bb68d51f1215918edfb0c0c8a2909da3abae8"} Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.455391 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="ceilometer-notification-agent" containerID="cri-o://665f62049f18fe7951920d0471c64261f10054396f5f92b033f79ab723ab1d3c" gracePeriod=30 Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.455523 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.456059 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="proxy-httpd" containerID="cri-o://7a42f86760e35df717ae00f2122bb68d51f1215918edfb0c0c8a2909da3abae8" gracePeriod=30 Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.456164 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="sg-core" containerID="cri-o://29a5f95506edd88a3900874b31b4f2fd1debe97b135916f4acefaf0c6ec2da85" gracePeriod=30 Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.496211 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" event={"ID":"fccc8879-5a26-4644-8cb0-783fe9816cbd","Type":"ContainerDied","Data":"802f4c2ee0f67f99a78776bd21eee8bfe117a908f2ffe70ba76b697af135848f"} Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.496692 4669 scope.go:117] "RemoveContainer" containerID="28370ffbeed8916aefd49454aa508a57a8c4cb3ac5e9233234d654e821ae0130" Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.496847 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-45rqt" Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.505028 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" event={"ID":"e830daa9-ce44-48e3-8a0e-61a51a58b2b2","Type":"ContainerStarted","Data":"bd4223d7730a3ed11b849bc9f50b65250e0309027e2e72c4d9db3c2ae7410b7f"} Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.510528 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84b6d46dff-gdp9m" event={"ID":"c2f34b06-3e5b-4380-8b38-4c9be553dc00","Type":"ContainerStarted","Data":"ff3ce2ea1f28bd7412c699fbbd378837f191063bccc465cb4add1ceb007d615e"} Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.516846 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"deb3368d-5a59-43f5-93df-5c4c45d00de2","Type":"ContainerStarted","Data":"7580566c60218d033d23c8abcfd189bbc5a2fe0f47019565fbfe1ef84051736c"} Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.537880 4669 scope.go:117] "RemoveContainer" containerID="57b41adf8e2ceacb3b887ba4c80bec7fdf92b3caf8b903c8c10bf4f803f44123" Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.630342 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-45rqt"] Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.638953 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-45rqt"] Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.679762 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007c4768-f1c7-4750-a403-9a930798b8fb" path="/var/lib/kubelet/pods/007c4768-f1c7-4750-a403-9a930798b8fb/volumes" Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.680570 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56bc6065-f53f-4531-b18b-d7cab77a717b" 
path="/var/lib/kubelet/pods/56bc6065-f53f-4531-b18b-d7cab77a717b/volumes" Oct 01 11:46:31 crc kubenswrapper[4669]: I1001 11:46:31.681185 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fccc8879-5a26-4644-8cb0-783fe9816cbd" path="/var/lib/kubelet/pods/fccc8879-5a26-4644-8cb0-783fe9816cbd/volumes" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.333966 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-75fdb4d7c7-7ltfb"] Oct 01 11:46:32 crc kubenswrapper[4669]: E1001 11:46:32.337895 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fccc8879-5a26-4644-8cb0-783fe9816cbd" containerName="dnsmasq-dns" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.337916 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fccc8879-5a26-4644-8cb0-783fe9816cbd" containerName="dnsmasq-dns" Oct 01 11:46:32 crc kubenswrapper[4669]: E1001 11:46:32.337951 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fccc8879-5a26-4644-8cb0-783fe9816cbd" containerName="init" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.337959 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fccc8879-5a26-4644-8cb0-783fe9816cbd" containerName="init" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.338167 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="fccc8879-5a26-4644-8cb0-783fe9816cbd" containerName="dnsmasq-dns" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.339836 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.352438 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.352783 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75fdb4d7c7-7ltfb"] Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.354215 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.465051 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-httpd-config\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.465117 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-internal-tls-certs\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.465141 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-combined-ca-bundle\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.465165 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-ovndb-tls-certs\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.465290 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-public-tls-certs\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.465322 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-config\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.465681 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25p4l\" (UniqueName: \"kubernetes.io/projected/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-kube-api-access-25p4l\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.538513 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"deb3368d-5a59-43f5-93df-5c4c45d00de2","Type":"ContainerStarted","Data":"8d55813d7e23902ec2a7f34e519f0fca2e7fbfd950e6d6e249920886ed0076de"} Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.538740 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="deb3368d-5a59-43f5-93df-5c4c45d00de2" containerName="glance-log" 
containerID="cri-o://7580566c60218d033d23c8abcfd189bbc5a2fe0f47019565fbfe1ef84051736c" gracePeriod=30 Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.539748 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="deb3368d-5a59-43f5-93df-5c4c45d00de2" containerName="glance-httpd" containerID="cri-o://8d55813d7e23902ec2a7f34e519f0fca2e7fbfd950e6d6e249920886ed0076de" gracePeriod=30 Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.555927 4669 generic.go:334] "Generic (PLEG): container finished" podID="e1ba96b9-d556-419e-a8a3-f90348499977" containerID="7a42f86760e35df717ae00f2122bb68d51f1215918edfb0c0c8a2909da3abae8" exitCode=0 Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.555989 4669 generic.go:334] "Generic (PLEG): container finished" podID="e1ba96b9-d556-419e-a8a3-f90348499977" containerID="29a5f95506edd88a3900874b31b4f2fd1debe97b135916f4acefaf0c6ec2da85" exitCode=2 Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.556050 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1ba96b9-d556-419e-a8a3-f90348499977","Type":"ContainerDied","Data":"7a42f86760e35df717ae00f2122bb68d51f1215918edfb0c0c8a2909da3abae8"} Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.556134 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1ba96b9-d556-419e-a8a3-f90348499977","Type":"ContainerDied","Data":"29a5f95506edd88a3900874b31b4f2fd1debe97b135916f4acefaf0c6ec2da85"} Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.568011 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-public-tls-certs\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 
11:46:32.570676 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fbb698fb8-vwrw5" event={"ID":"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598","Type":"ContainerStarted","Data":"1b00fbdad665789e469be97345430a77ebd335bc428a8755e3634ea5c13ee394"} Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.570741 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fbb698fb8-vwrw5" event={"ID":"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598","Type":"ContainerStarted","Data":"9c24df43128d70950e52a0798d7495261e93ff9bcb648818e9e80f49bb11938d"} Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.570754 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fbb698fb8-vwrw5" event={"ID":"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598","Type":"ContainerStarted","Data":"f234e6191799de49d8c81fcd839a29f3d69eedbbe274c05da476b574e5fdcda3"} Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.572450 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.577646 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.577614624 podStartE2EDuration="11.577614624s" podCreationTimestamp="2025-10-01 11:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:32.565699003 +0000 UTC m=+1083.665263980" watchObservedRunningTime="2025-10-01 11:46:32.577614624 +0000 UTC m=+1083.677179601" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.578149 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-config\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 
01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.578497 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25p4l\" (UniqueName: \"kubernetes.io/projected/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-kube-api-access-25p4l\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.578681 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-httpd-config\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.588829 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-internal-tls-certs\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.588898 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-combined-ca-bundle\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.588961 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-ovndb-tls-certs\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.593820 4669 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b8d32351-4bba-4ea7-b8f1-69442fe50ac3" containerName="glance-log" containerID="cri-o://e73cfd3b900b9c3d2c5c3300cc74cbb9d6530fb00c558b80d22b1835b56d0c45" gracePeriod=30 Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.594093 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-httpd-config\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.594189 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8d32351-4bba-4ea7-b8f1-69442fe50ac3","Type":"ContainerStarted","Data":"7897b5906b2427beeaf96977db8c85ee03f33d6d1cfc1909fa0668c23e608297"} Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.594271 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b8d32351-4bba-4ea7-b8f1-69442fe50ac3" containerName="glance-httpd" containerID="cri-o://7897b5906b2427beeaf96977db8c85ee03f33d6d1cfc1909fa0668c23e608297" gracePeriod=30 Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.598898 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbb868f5d-l9gnp" event={"ID":"e4b949b8-5f5e-4f46-836c-7be0991a67d9","Type":"ContainerStarted","Data":"c2add750e836c213a247c06abd2f27660764e73057e068ea011873a044d4f9c0"} Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.598954 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbb868f5d-l9gnp" event={"ID":"e4b949b8-5f5e-4f46-836c-7be0991a67d9","Type":"ContainerStarted","Data":"372a8648a796d20fe068b4ce1c4d817639caf884cb78f9712e2eab7b93a5788f"} Oct 01 11:46:32 crc 
kubenswrapper[4669]: I1001 11:46:32.598968 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbb868f5d-l9gnp" event={"ID":"e4b949b8-5f5e-4f46-836c-7be0991a67d9","Type":"ContainerStarted","Data":"4b73f7d0a3f16ee82a4eeac8fc7804552571c8ac7348ec15ea03dbc63bf9c468"}
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.601339 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fbb868f5d-l9gnp"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.601413 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fbb868f5d-l9gnp"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.604001 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-795f7c5588-ppc46" event={"ID":"419df7bd-f554-4888-8a51-e885964ada7e","Type":"ContainerStarted","Data":"d6a38eed80aeffc462542684b99308cf337d72bde531ede56b8854d6f800a56e"}
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.604053 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-795f7c5588-ppc46" event={"ID":"419df7bd-f554-4888-8a51-e885964ada7e","Type":"ContainerStarted","Data":"a47924e4bc074564a7897bcded7bf2441dfd66bff2af85d8723734c33a6c9edb"}
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.604999 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-795f7c5588-ppc46"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.605124 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-795f7c5588-ppc46"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.606473 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-config\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.606900 4669 generic.go:334] "Generic (PLEG): container finished" podID="e830daa9-ce44-48e3-8a0e-61a51a58b2b2" containerID="fe4bfdafde10c16962a1f295de5f5ead419963d2b6c9c7ec5abc8f8f14f08c61" exitCode=0
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.606931 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" event={"ID":"e830daa9-ce44-48e3-8a0e-61a51a58b2b2","Type":"ContainerDied","Data":"fe4bfdafde10c16962a1f295de5f5ead419963d2b6c9c7ec5abc8f8f14f08c61"}
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.610027 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-ovndb-tls-certs\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.621177 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-internal-tls-certs\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.622045 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-combined-ca-bundle\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.622391 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25p4l\" (UniqueName: \"kubernetes.io/projected/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-kube-api-access-25p4l\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.627663 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6fbb698fb8-vwrw5" podStartSLOduration=3.627628803 podStartE2EDuration="3.627628803s" podCreationTimestamp="2025-10-01 11:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:32.621647217 +0000 UTC m=+1083.721212194" watchObservedRunningTime="2025-10-01 11:46:32.627628803 +0000 UTC m=+1083.727193780"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.629024 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74d7e57e-eda0-4134-bfd3-ed2c0e4826bf-public-tls-certs\") pod \"neutron-75fdb4d7c7-7ltfb\" (UID: \"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf\") " pod="openstack/neutron-75fdb4d7c7-7ltfb"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.656069 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-795f7c5588-ppc46" podStartSLOduration=3.656044247 podStartE2EDuration="3.656044247s" podCreationTimestamp="2025-10-01 11:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:32.648040151 +0000 UTC m=+1083.747605118" watchObservedRunningTime="2025-10-01 11:46:32.656044247 +0000 UTC m=+1083.755609224"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.670406 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75fdb4d7c7-7ltfb"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.722058 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5fbb868f5d-l9gnp" podStartSLOduration=3.722034436 podStartE2EDuration="3.722034436s" podCreationTimestamp="2025-10-01 11:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:32.697531778 +0000 UTC m=+1083.797096755" watchObservedRunningTime="2025-10-01 11:46:32.722034436 +0000 UTC m=+1083.821599413"
Oct 01 11:46:32 crc kubenswrapper[4669]: I1001 11:46:32.733163 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.733148127 podStartE2EDuration="11.733148127s" podCreationTimestamp="2025-10-01 11:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:32.731373214 +0000 UTC m=+1083.830938201" watchObservedRunningTime="2025-10-01 11:46:32.733148127 +0000 UTC m=+1083.832713194"
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.624273 4669 generic.go:334] "Generic (PLEG): container finished" podID="deb3368d-5a59-43f5-93df-5c4c45d00de2" containerID="8d55813d7e23902ec2a7f34e519f0fca2e7fbfd950e6d6e249920886ed0076de" exitCode=0
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.624713 4669 generic.go:334] "Generic (PLEG): container finished" podID="deb3368d-5a59-43f5-93df-5c4c45d00de2" containerID="7580566c60218d033d23c8abcfd189bbc5a2fe0f47019565fbfe1ef84051736c" exitCode=143
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.624365 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"deb3368d-5a59-43f5-93df-5c4c45d00de2","Type":"ContainerDied","Data":"8d55813d7e23902ec2a7f34e519f0fca2e7fbfd950e6d6e249920886ed0076de"}
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.624797 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"deb3368d-5a59-43f5-93df-5c4c45d00de2","Type":"ContainerDied","Data":"7580566c60218d033d23c8abcfd189bbc5a2fe0f47019565fbfe1ef84051736c"}
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.629960 4669 generic.go:334] "Generic (PLEG): container finished" podID="e1ba96b9-d556-419e-a8a3-f90348499977" containerID="665f62049f18fe7951920d0471c64261f10054396f5f92b033f79ab723ab1d3c" exitCode=0
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.630048 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1ba96b9-d556-419e-a8a3-f90348499977","Type":"ContainerDied","Data":"665f62049f18fe7951920d0471c64261f10054396f5f92b033f79ab723ab1d3c"}
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.633020 4669 generic.go:334] "Generic (PLEG): container finished" podID="b8d32351-4bba-4ea7-b8f1-69442fe50ac3" containerID="7897b5906b2427beeaf96977db8c85ee03f33d6d1cfc1909fa0668c23e608297" exitCode=0
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.633054 4669 generic.go:334] "Generic (PLEG): container finished" podID="b8d32351-4bba-4ea7-b8f1-69442fe50ac3" containerID="e73cfd3b900b9c3d2c5c3300cc74cbb9d6530fb00c558b80d22b1835b56d0c45" exitCode=143
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.633101 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8d32351-4bba-4ea7-b8f1-69442fe50ac3","Type":"ContainerDied","Data":"7897b5906b2427beeaf96977db8c85ee03f33d6d1cfc1909fa0668c23e608297"}
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.633146 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8d32351-4bba-4ea7-b8f1-69442fe50ac3","Type":"ContainerDied","Data":"e73cfd3b900b9c3d2c5c3300cc74cbb9d6530fb00c558b80d22b1835b56d0c45"}
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.635803 4669 generic.go:334] "Generic (PLEG): container finished" podID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerID="e291c34c35b2e8ea3b830f371585d97db07df75e845acb5850aa9ed5690727d9" exitCode=0
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.637013 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866c85f5d8-mvd64" event={"ID":"62dab5a8-a8e3-4496-8187-089069b8e14f","Type":"ContainerDied","Data":"e291c34c35b2e8ea3b830f371585d97db07df75e845acb5850aa9ed5690727d9"}
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.817310 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.870898 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-combined-ca-bundle\") pod \"e1ba96b9-d556-419e-a8a3-f90348499977\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") "
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.870977 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjmhm\" (UniqueName: \"kubernetes.io/projected/e1ba96b9-d556-419e-a8a3-f90348499977-kube-api-access-jjmhm\") pod \"e1ba96b9-d556-419e-a8a3-f90348499977\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") "
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.871113 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-run-httpd\") pod \"e1ba96b9-d556-419e-a8a3-f90348499977\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") "
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.871139 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-config-data\") pod \"e1ba96b9-d556-419e-a8a3-f90348499977\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") "
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.871205 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-scripts\") pod \"e1ba96b9-d556-419e-a8a3-f90348499977\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") "
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.871221 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-sg-core-conf-yaml\") pod \"e1ba96b9-d556-419e-a8a3-f90348499977\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") "
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.871317 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-log-httpd\") pod \"e1ba96b9-d556-419e-a8a3-f90348499977\" (UID: \"e1ba96b9-d556-419e-a8a3-f90348499977\") "
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.871967 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e1ba96b9-d556-419e-a8a3-f90348499977" (UID: "e1ba96b9-d556-419e-a8a3-f90348499977"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.872474 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e1ba96b9-d556-419e-a8a3-f90348499977" (UID: "e1ba96b9-d556-419e-a8a3-f90348499977"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.878109 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-scripts" (OuterVolumeSpecName: "scripts") pod "e1ba96b9-d556-419e-a8a3-f90348499977" (UID: "e1ba96b9-d556-419e-a8a3-f90348499977"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.882653 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ba96b9-d556-419e-a8a3-f90348499977-kube-api-access-jjmhm" (OuterVolumeSpecName: "kube-api-access-jjmhm") pod "e1ba96b9-d556-419e-a8a3-f90348499977" (UID: "e1ba96b9-d556-419e-a8a3-f90348499977"). InnerVolumeSpecName "kube-api-access-jjmhm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.918228 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e1ba96b9-d556-419e-a8a3-f90348499977" (UID: "e1ba96b9-d556-419e-a8a3-f90348499977"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.977168 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.977210 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjmhm\" (UniqueName: \"kubernetes.io/projected/e1ba96b9-d556-419e-a8a3-f90348499977-kube-api-access-jjmhm\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.977244 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1ba96b9-d556-419e-a8a3-f90348499977-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.977256 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:33 crc kubenswrapper[4669]: I1001 11:46:33.977264 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.065506 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.082315 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1ba96b9-d556-419e-a8a3-f90348499977" (UID: "e1ba96b9-d556-419e-a8a3-f90348499977"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.109211 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-config-data" (OuterVolumeSpecName: "config-data") pod "e1ba96b9-d556-419e-a8a3-f90348499977" (UID: "e1ba96b9-d556-419e-a8a3-f90348499977"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.181655 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rbvz\" (UniqueName: \"kubernetes.io/projected/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-kube-api-access-5rbvz\") pod \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.181875 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-config-data\") pod \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.181940 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-httpd-run\") pod \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.181958 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-logs\") pod \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.182102 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.182899 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-logs" (OuterVolumeSpecName: "logs") pod "b8d32351-4bba-4ea7-b8f1-69442fe50ac3" (UID: "b8d32351-4bba-4ea7-b8f1-69442fe50ac3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.183290 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b8d32351-4bba-4ea7-b8f1-69442fe50ac3" (UID: "b8d32351-4bba-4ea7-b8f1-69442fe50ac3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.183816 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-combined-ca-bundle\") pod \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.183982 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-scripts\") pod \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\" (UID: \"b8d32351-4bba-4ea7-b8f1-69442fe50ac3\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.186136 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.186163 4669 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.186201 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-logs\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.186238 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ba96b9-d556-419e-a8a3-f90348499977-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.189465 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "b8d32351-4bba-4ea7-b8f1-69442fe50ac3" (UID: "b8d32351-4bba-4ea7-b8f1-69442fe50ac3"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.190815 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-scripts" (OuterVolumeSpecName: "scripts") pod "b8d32351-4bba-4ea7-b8f1-69442fe50ac3" (UID: "b8d32351-4bba-4ea7-b8f1-69442fe50ac3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.193397 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-kube-api-access-5rbvz" (OuterVolumeSpecName: "kube-api-access-5rbvz") pod "b8d32351-4bba-4ea7-b8f1-69442fe50ac3" (UID: "b8d32351-4bba-4ea7-b8f1-69442fe50ac3"). InnerVolumeSpecName "kube-api-access-5rbvz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.203516 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.250478 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8d32351-4bba-4ea7-b8f1-69442fe50ac3" (UID: "b8d32351-4bba-4ea7-b8f1-69442fe50ac3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.271575 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-config-data" (OuterVolumeSpecName: "config-data") pod "b8d32351-4bba-4ea7-b8f1-69442fe50ac3" (UID: "b8d32351-4bba-4ea7-b8f1-69442fe50ac3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.287904 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-combined-ca-bundle\") pod \"deb3368d-5a59-43f5-93df-5c4c45d00de2\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.288201 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-config-data\") pod \"deb3368d-5a59-43f5-93df-5c4c45d00de2\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.288270 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wmv8\" (UniqueName: \"kubernetes.io/projected/deb3368d-5a59-43f5-93df-5c4c45d00de2-kube-api-access-2wmv8\") pod \"deb3368d-5a59-43f5-93df-5c4c45d00de2\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.288319 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"deb3368d-5a59-43f5-93df-5c4c45d00de2\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.288440 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-logs\") pod \"deb3368d-5a59-43f5-93df-5c4c45d00de2\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.288505 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-httpd-run\") pod \"deb3368d-5a59-43f5-93df-5c4c45d00de2\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.288535 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-scripts\") pod \"deb3368d-5a59-43f5-93df-5c4c45d00de2\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.289044 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.289061 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rbvz\" (UniqueName: \"kubernetes.io/projected/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-kube-api-access-5rbvz\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.289087 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.289119 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.289130 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d32351-4bba-4ea7-b8f1-69442fe50ac3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.289865 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "deb3368d-5a59-43f5-93df-5c4c45d00de2" (UID: "deb3368d-5a59-43f5-93df-5c4c45d00de2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.290147 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-logs" (OuterVolumeSpecName: "logs") pod "deb3368d-5a59-43f5-93df-5c4c45d00de2" (UID: "deb3368d-5a59-43f5-93df-5c4c45d00de2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.299090 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-scripts" (OuterVolumeSpecName: "scripts") pod "deb3368d-5a59-43f5-93df-5c4c45d00de2" (UID: "deb3368d-5a59-43f5-93df-5c4c45d00de2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.300114 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "deb3368d-5a59-43f5-93df-5c4c45d00de2" (UID: "deb3368d-5a59-43f5-93df-5c4c45d00de2"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.302723 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb3368d-5a59-43f5-93df-5c4c45d00de2-kube-api-access-2wmv8" (OuterVolumeSpecName: "kube-api-access-2wmv8") pod "deb3368d-5a59-43f5-93df-5c4c45d00de2" (UID: "deb3368d-5a59-43f5-93df-5c4c45d00de2"). InnerVolumeSpecName "kube-api-access-2wmv8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.336125 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.362312 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-866c85f5d8-mvd64" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.392803 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "deb3368d-5a59-43f5-93df-5c4c45d00de2" (UID: "deb3368d-5a59-43f5-93df-5c4c45d00de2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.396126 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-combined-ca-bundle\") pod \"deb3368d-5a59-43f5-93df-5c4c45d00de2\" (UID: \"deb3368d-5a59-43f5-93df-5c4c45d00de2\") "
Oct 01 11:46:34 crc kubenswrapper[4669]: W1001 11:46:34.396560 4669 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/deb3368d-5a59-43f5-93df-5c4c45d00de2/volumes/kubernetes.io~secret/combined-ca-bundle
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.396620 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "deb3368d-5a59-43f5-93df-5c4c45d00de2" (UID: "deb3368d-5a59-43f5-93df-5c4c45d00de2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.410058 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.410114 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wmv8\" (UniqueName: \"kubernetes.io/projected/deb3368d-5a59-43f5-93df-5c4c45d00de2-kube-api-access-2wmv8\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.410352 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.410364 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.410376 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-logs\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.410404 4669 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/deb3368d-5a59-43f5-93df-5c4c45d00de2-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.410417 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.422192 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-config-data" (OuterVolumeSpecName: "config-data") pod "deb3368d-5a59-43f5-93df-5c4c45d00de2" (UID: "deb3368d-5a59-43f5-93df-5c4c45d00de2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.437390 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.500966 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75fdb4d7c7-7ltfb"]
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.515625 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb3368d-5a59-43f5-93df-5c4c45d00de2-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.515682 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Oct 01 11:46:34 crc kubenswrapper[4669]: W1001 11:46:34.539736 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74d7e57e_eda0_4134_bfd3_ed2c0e4826bf.slice/crio-203fa7da6ec519cba27e2c5cf4b99cb2b3879549625d5cea5a2704f82a269ec4 WatchSource:0}: Error finding container 203fa7da6ec519cba27e2c5cf4b99cb2b3879549625d5cea5a2704f82a269ec4: Status 404 returned error can't find the container with id 203fa7da6ec519cba27e2c5cf4b99cb2b3879549625d5cea5a2704f82a269ec4
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.680548 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"deb3368d-5a59-43f5-93df-5c4c45d00de2","Type":"ContainerDied","Data":"68ff0909928dcafe9ea63dc6e69106c70bc5ac41389ac2fbb03f9e25131a9e7c"}
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.680616 4669 scope.go:117] "RemoveContainer" containerID="8d55813d7e23902ec2a7f34e519f0fca2e7fbfd950e6d6e249920886ed0076de"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.680809 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.699572 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1ba96b9-d556-419e-a8a3-f90348499977","Type":"ContainerDied","Data":"fba6352b99c106c6d02c70921546ef2655086a2e387e4d8dda7182310ea81690"}
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.699579 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.703274 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" event={"ID":"14df8713-8fa5-482c-9280-af169783618d","Type":"ContainerStarted","Data":"33586e4f0f1d8ce18f6bbf88bef98388973959cb357a27edfbb397a99d657caa"}
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.703333 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" event={"ID":"14df8713-8fa5-482c-9280-af169783618d","Type":"ContainerStarted","Data":"cb82deb75907c37cee028e7ff3961d689b434c878faeee7daf358b58dc7cb08c"}
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.717630 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" event={"ID":"e830daa9-ce44-48e3-8a0e-61a51a58b2b2","Type":"ContainerStarted","Data":"3f2b1a344e4cd849a66e7f8f80ab83bf8881c32632fe6c57cbe158548e0aeaf4"}
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.718779 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.733176 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75fdb4d7c7-7ltfb" event={"ID":"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf","Type":"ContainerStarted","Data":"203fa7da6ec519cba27e2c5cf4b99cb2b3879549625d5cea5a2704f82a269ec4"}
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.740239 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5b7c87b994-mshrj" podStartSLOduration=2.9126302490000002 podStartE2EDuration="5.740202625s" podCreationTimestamp="2025-10-01 11:46:29 +0000 UTC" firstStartedPulling="2025-10-01 11:46:30.944739489 +0000 UTC m=+1082.044304456" lastFinishedPulling="2025-10-01 11:46:33.772311855 +0000 UTC m=+1084.871876832" observedRunningTime="2025-10-01 11:46:34.733660086 +0000 UTC m=+1085.833225063" watchObservedRunningTime="2025-10-01 11:46:34.740202625 +0000 UTC m=+1085.839767602"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.755209 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.763647 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b8d32351-4bba-4ea7-b8f1-69442fe50ac3","Type":"ContainerDied","Data":"c6c122a50db014eb27087cfd18562df832fa9e7cca5138de27e74fed4f6139f2"}
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.776557 4669 scope.go:117] "RemoveContainer" containerID="7580566c60218d033d23c8abcfd189bbc5a2fe0f47019565fbfe1ef84051736c"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.792115 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.825873 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.840127 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84b6d46dff-gdp9m" event={"ID":"c2f34b06-3e5b-4380-8b38-4c9be553dc00","Type":"ContainerStarted","Data":"c968e509d60154d417d19ab727a3b8bb20a8ed176f5f7caba5d5047aad905dfe"}
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.840583 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84b6d46dff-gdp9m" event={"ID":"c2f34b06-3e5b-4380-8b38-4c9be553dc00","Type":"ContainerStarted","Data":"3a9beb143b65c6ab18bfb286b0079968b198945addb39e4b01a19128d0f1ab9e"}
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875039 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 01 11:46:34 crc kubenswrapper[4669]: E1001 11:46:34.875543 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d32351-4bba-4ea7-b8f1-69442fe50ac3" containerName="glance-httpd"
Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875568 4669 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b8d32351-4bba-4ea7-b8f1-69442fe50ac3" containerName="glance-httpd" Oct 01 11:46:34 crc kubenswrapper[4669]: E1001 11:46:34.875598 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb3368d-5a59-43f5-93df-5c4c45d00de2" containerName="glance-httpd" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875608 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb3368d-5a59-43f5-93df-5c4c45d00de2" containerName="glance-httpd" Oct 01 11:46:34 crc kubenswrapper[4669]: E1001 11:46:34.875649 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="proxy-httpd" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875658 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="proxy-httpd" Oct 01 11:46:34 crc kubenswrapper[4669]: E1001 11:46:34.875668 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="sg-core" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875675 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="sg-core" Oct 01 11:46:34 crc kubenswrapper[4669]: E1001 11:46:34.875691 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d32351-4bba-4ea7-b8f1-69442fe50ac3" containerName="glance-log" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875697 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d32351-4bba-4ea7-b8f1-69442fe50ac3" containerName="glance-log" Oct 01 11:46:34 crc kubenswrapper[4669]: E1001 11:46:34.875719 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="ceilometer-notification-agent" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875728 4669 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="ceilometer-notification-agent" Oct 01 11:46:34 crc kubenswrapper[4669]: E1001 11:46:34.875742 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb3368d-5a59-43f5-93df-5c4c45d00de2" containerName="glance-log" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875749 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb3368d-5a59-43f5-93df-5c4c45d00de2" containerName="glance-log" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875940 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="ceilometer-notification-agent" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875953 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d32351-4bba-4ea7-b8f1-69442fe50ac3" containerName="glance-log" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875964 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb3368d-5a59-43f5-93df-5c4c45d00de2" containerName="glance-httpd" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875974 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d32351-4bba-4ea7-b8f1-69442fe50ac3" containerName="glance-httpd" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.875994 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="sg-core" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.876007 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb3368d-5a59-43f5-93df-5c4c45d00de2" containerName="glance-log" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.876020 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" containerName="proxy-httpd" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.880226 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.896823 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.897122 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.897269 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wkq9x" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.897381 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.939014 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.939111 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.939269 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:34 crc 
kubenswrapper[4669]: I1001 11:46:34.939299 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.939358 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.939390 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mhwg\" (UniqueName: \"kubernetes.io/projected/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-kube-api-access-2mhwg\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.939447 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-logs\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.939582 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:34 
crc kubenswrapper[4669]: I1001 11:46:34.954665 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:46:34 crc kubenswrapper[4669]: I1001 11:46:34.978265 4669 scope.go:117] "RemoveContainer" containerID="7a42f86760e35df717ae00f2122bb68d51f1215918edfb0c0c8a2909da3abae8" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:34.996813 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.023777 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.038762 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" podStartSLOduration=6.038735274 podStartE2EDuration="6.038735274s" podCreationTimestamp="2025-10-01 11:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:34.897643284 +0000 UTC m=+1085.997208261" watchObservedRunningTime="2025-10-01 11:46:35.038735274 +0000 UTC m=+1086.138300251" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.042274 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.042357 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc 
kubenswrapper[4669]: I1001 11:46:35.046698 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.048769 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.048811 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.048928 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.048997 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mhwg\" (UniqueName: \"kubernetes.io/projected/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-kube-api-access-2mhwg\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.049092 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-logs\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.049358 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.054230 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.056706 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.057286 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.060493 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.060888 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-logs\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.061211 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.066119 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.069222 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.069307 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.079967 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ceilometer-0"] Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.082715 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.086582 4669 scope.go:117] "RemoveContainer" containerID="29a5f95506edd88a3900874b31b4f2fd1debe97b135916f4acefaf0c6ec2da85" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.094566 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mhwg\" (UniqueName: \"kubernetes.io/projected/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-kube-api-access-2mhwg\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.127502 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.127590 4669 scope.go:117] "RemoveContainer" containerID="665f62049f18fe7951920d0471c64261f10054396f5f92b033f79ab723ab1d3c" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.138644 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-84b6d46dff-gdp9m" podStartSLOduration=3.35471641 podStartE2EDuration="6.13861341s" podCreationTimestamp="2025-10-01 11:46:29 +0000 UTC" firstStartedPulling="2025-10-01 11:46:30.963556048 +0000 UTC m=+1082.063121025" lastFinishedPulling="2025-10-01 11:46:33.747453048 +0000 UTC m=+1084.847018025" 
observedRunningTime="2025-10-01 11:46:34.976680271 +0000 UTC m=+1086.076245248" watchObservedRunningTime="2025-10-01 11:46:35.13861341 +0000 UTC m=+1086.238178387" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.152979 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-log-httpd\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.153040 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.153269 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-scripts\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.153311 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-run-httpd\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.153375 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " 
pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.153393 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94xrk\" (UniqueName: \"kubernetes.io/projected/070d0729-602a-4401-ad53-e721f87a447c-kube-api-access-94xrk\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.153414 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-config-data\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.160352 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.167438 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.170354 4669 scope.go:117] "RemoveContainer" containerID="7897b5906b2427beeaf96977db8c85ee03f33d6d1cfc1909fa0668c23e608297" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.177277 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.181858 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.186317 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.189148 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.189217 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.195274 4669 scope.go:117] "RemoveContainer" containerID="e73cfd3b900b9c3d2c5c3300cc74cbb9d6530fb00c558b80d22b1835b56d0c45" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.253531 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.254728 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.255616 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.255659 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94xrk\" (UniqueName: \"kubernetes.io/projected/070d0729-602a-4401-ad53-e721f87a447c-kube-api-access-94xrk\") pod 
\"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.255686 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-config-data\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.255719 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.255863 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-log-httpd\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.255893 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.255936 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc 
kubenswrapper[4669]: I1001 11:46:35.256100 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.256123 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j5rz\" (UniqueName: \"kubernetes.io/projected/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-kube-api-access-2j5rz\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.256291 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.256329 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-scripts\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.256366 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-logs\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 
11:46:35.256409 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-run-httpd\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.256441 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.258007 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-log-httpd\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.258720 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-run-httpd\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.261360 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.264748 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-scripts\") pod \"ceilometer-0\" (UID: 
\"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.270483 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.271041 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-config-data\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.287763 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94xrk\" (UniqueName: \"kubernetes.io/projected/070d0729-602a-4401-ad53-e721f87a447c-kube-api-access-94xrk\") pod \"ceilometer-0\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.358030 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.359264 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.359196 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.360063 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.360155 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.360178 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j5rz\" (UniqueName: \"kubernetes.io/projected/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-kube-api-access-2j5rz\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.360234 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.360264 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-logs\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.360296 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.360733 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.364370 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-logs\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.369215 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.370506 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.388438 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.390177 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j5rz\" (UniqueName: \"kubernetes.io/projected/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-kube-api-access-2j5rz\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.398308 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.431345 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.432034 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.520949 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.666934 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d32351-4bba-4ea7-b8f1-69442fe50ac3" path="/var/lib/kubelet/pods/b8d32351-4bba-4ea7-b8f1-69442fe50ac3/volumes" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.668116 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb3368d-5a59-43f5-93df-5c4c45d00de2" path="/var/lib/kubelet/pods/deb3368d-5a59-43f5-93df-5c4c45d00de2/volumes" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.670744 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ba96b9-d556-419e-a8a3-f90348499977" path="/var/lib/kubelet/pods/e1ba96b9-d556-419e-a8a3-f90348499977/volumes" Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.942003 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-h6rw6" event={"ID":"db2bb6cb-ab40-4534-967e-c71b62323512","Type":"ContainerStarted","Data":"f67efb9a0c894ae8c54a4f8cfe6605d05de698063188f932a48676a902d4fa02"} Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.999404 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.999557 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75fdb4d7c7-7ltfb" event={"ID":"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf","Type":"ContainerStarted","Data":"752ddd1fe66d90e357c0a5c270cc8502a2ad49bd624e54bdeb0b6c9e7652dc40"} Oct 01 11:46:35 crc kubenswrapper[4669]: I1001 11:46:35.999595 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75fdb4d7c7-7ltfb" event={"ID":"74d7e57e-eda0-4134-bfd3-ed2c0e4826bf","Type":"ContainerStarted","Data":"9a8aa66873d718a6a6e76fb9acc65e15fa89928dde9a9b28148e5b36716dd1ef"} Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.001888 4669 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.034831 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-h6rw6" podStartSLOduration=18.226589882 podStartE2EDuration="56.034799191s" podCreationTimestamp="2025-10-01 11:45:40 +0000 UTC" firstStartedPulling="2025-10-01 11:45:55.939398993 +0000 UTC m=+1047.038963970" lastFinishedPulling="2025-10-01 11:46:33.747608302 +0000 UTC m=+1084.847173279" observedRunningTime="2025-10-01 11:46:35.971730774 +0000 UTC m=+1087.071295751" watchObservedRunningTime="2025-10-01 11:46:36.034799191 +0000 UTC m=+1087.134364168" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.079167 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-75fdb4d7c7-7ltfb" podStartSLOduration=4.079139343 podStartE2EDuration="4.079139343s" podCreationTimestamp="2025-10-01 11:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:36.038706427 +0000 UTC m=+1087.138271414" watchObservedRunningTime="2025-10-01 11:46:36.079139343 +0000 UTC m=+1087.178704320" Oct 01 11:46:36 crc kubenswrapper[4669]: W1001 11:46:36.090911 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod070d0729_602a_4401_ad53_e721f87a447c.slice/crio-22d05982856866330e43adca8f4e4fb5debf89033bcad62ce54c6a22ea76e110 WatchSource:0}: Error finding container 22d05982856866330e43adca8f4e4fb5debf89033bcad62ce54c6a22ea76e110: Status 404 returned error can't find the container with id 22d05982856866330e43adca8f4e4fb5debf89033bcad62ce54c6a22ea76e110 Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.126668 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:46:36 crc 
kubenswrapper[4669]: I1001 11:46:36.254067 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.464847 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84d474ff6b-jd7xf"] Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.466546 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.468563 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.471220 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.526785 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84d474ff6b-jd7xf"] Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.533608 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-combined-ca-bundle\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.533669 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-config-data-custom\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.533697 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-config-data\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.533759 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zfhb\" (UniqueName: \"kubernetes.io/projected/90e4ab06-115b-4efa-9a11-d16218dec9e0-kube-api-access-2zfhb\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.533780 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-internal-tls-certs\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.533797 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-public-tls-certs\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.533844 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90e4ab06-115b-4efa-9a11-d16218dec9e0-logs\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.636476 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90e4ab06-115b-4efa-9a11-d16218dec9e0-logs\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.636628 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-combined-ca-bundle\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.637146 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90e4ab06-115b-4efa-9a11-d16218dec9e0-logs\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.636662 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-config-data-custom\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.637860 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-config-data\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.637921 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zfhb\" (UniqueName: 
\"kubernetes.io/projected/90e4ab06-115b-4efa-9a11-d16218dec9e0-kube-api-access-2zfhb\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.637946 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-internal-tls-certs\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.637967 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-public-tls-certs\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.650920 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-config-data-custom\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.651116 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-config-data\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.656754 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-internal-tls-certs\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.673162 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zfhb\" (UniqueName: \"kubernetes.io/projected/90e4ab06-115b-4efa-9a11-d16218dec9e0-kube-api-access-2zfhb\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.676065 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-combined-ca-bundle\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.680557 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e4ab06-115b-4efa-9a11-d16218dec9e0-public-tls-certs\") pod \"barbican-api-84d474ff6b-jd7xf\" (UID: \"90e4ab06-115b-4efa-9a11-d16218dec9e0\") " pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:36 crc kubenswrapper[4669]: I1001 11:46:36.862487 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:37 crc kubenswrapper[4669]: I1001 11:46:37.033555 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3e7cb3e-aea7-4369-91a7-ccdaf2531415","Type":"ContainerStarted","Data":"09151fd19778a6fa820f6cce3c3ea829df696b93b20481ef96467db57af8311c"} Oct 01 11:46:37 crc kubenswrapper[4669]: I1001 11:46:37.037180 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"070d0729-602a-4401-ad53-e721f87a447c","Type":"ContainerStarted","Data":"22d05982856866330e43adca8f4e4fb5debf89033bcad62ce54c6a22ea76e110"} Oct 01 11:46:37 crc kubenswrapper[4669]: I1001 11:46:37.058761 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e","Type":"ContainerStarted","Data":"8d0acaf38053c80e1c53c4f04e51b371236b4aa9b2de2e8e32685bf69e3d19d3"} Oct 01 11:46:37 crc kubenswrapper[4669]: I1001 11:46:37.345944 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84d474ff6b-jd7xf"] Oct 01 11:46:38 crc kubenswrapper[4669]: I1001 11:46:38.075618 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e","Type":"ContainerStarted","Data":"8b2221caba3f5623b222afb7931d94552eafe6fabfc0de0b300e59c9971a63d8"} Oct 01 11:46:38 crc kubenswrapper[4669]: I1001 11:46:38.078657 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3e7cb3e-aea7-4369-91a7-ccdaf2531415","Type":"ContainerStarted","Data":"3ff0027a5d4bf893f4821b994e546cad17f5cfdaaf0fa568764fcf16ddf16d3f"} Oct 01 11:46:38 crc kubenswrapper[4669]: I1001 11:46:38.080405 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d474ff6b-jd7xf" 
event={"ID":"90e4ab06-115b-4efa-9a11-d16218dec9e0","Type":"ContainerStarted","Data":"6b7c5907ad39bc3f1f8ab82e30c60ab75c704bbdc7bc9395d2ec2a03665884e5"} Oct 01 11:46:38 crc kubenswrapper[4669]: I1001 11:46:38.080434 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d474ff6b-jd7xf" event={"ID":"90e4ab06-115b-4efa-9a11-d16218dec9e0","Type":"ContainerStarted","Data":"5ada47ef1c4d002c19a699cbcbd449e75aa2b1f7a2e64b1e4d0f354c0cccf4e1"} Oct 01 11:46:38 crc kubenswrapper[4669]: I1001 11:46:38.083144 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"070d0729-602a-4401-ad53-e721f87a447c","Type":"ContainerStarted","Data":"3f8f1272aef600a2c67d9d6be60e69f39c589a20ced292505152864eb7e3450b"} Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.112122 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3e7cb3e-aea7-4369-91a7-ccdaf2531415","Type":"ContainerStarted","Data":"b5eacb4ec5ca11175d7373e4c1b4fe7356249a8933437d7c1394c5af780dabc0"} Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.121190 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d474ff6b-jd7xf" event={"ID":"90e4ab06-115b-4efa-9a11-d16218dec9e0","Type":"ContainerStarted","Data":"0346b1d252bed5af1caac536db5fbadfd2091a933fba50513748f3d7f7be89ad"} Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.126304 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e","Type":"ContainerStarted","Data":"4a46420043020db08e7e7eda689a74f780f828af0ee6c273152e621e55ad694e"} Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.160924 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.160896379 podStartE2EDuration="6.160896379s" podCreationTimestamp="2025-10-01 
11:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:40.14244827 +0000 UTC m=+1091.242013237" watchObservedRunningTime="2025-10-01 11:46:40.160896379 +0000 UTC m=+1091.260461356" Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.177258 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.195499 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84d474ff6b-jd7xf" podStartSLOduration=4.195473123 podStartE2EDuration="4.195473123s" podCreationTimestamp="2025-10-01 11:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:40.192657853 +0000 UTC m=+1091.292222820" watchObservedRunningTime="2025-10-01 11:46:40.195473123 +0000 UTC m=+1091.295038100" Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.200780 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.200755591 podStartE2EDuration="5.200755591s" podCreationTimestamp="2025-10-01 11:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:40.174367217 +0000 UTC m=+1091.273932204" watchObservedRunningTime="2025-10-01 11:46:40.200755591 +0000 UTC m=+1091.300320568" Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.280885 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mjq5j"] Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.296580 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" 
podUID="1f2a482e-2660-474a-9e8f-28c2d9b75648" containerName="dnsmasq-dns" containerID="cri-o://e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da" gracePeriod=10 Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.798638 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.902144 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-config\") pod \"1f2a482e-2660-474a-9e8f-28c2d9b75648\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.902347 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-nb\") pod \"1f2a482e-2660-474a-9e8f-28c2d9b75648\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.902426 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-sb\") pod \"1f2a482e-2660-474a-9e8f-28c2d9b75648\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.902496 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-swift-storage-0\") pod \"1f2a482e-2660-474a-9e8f-28c2d9b75648\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.902549 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kp2s\" (UniqueName: 
\"kubernetes.io/projected/1f2a482e-2660-474a-9e8f-28c2d9b75648-kube-api-access-7kp2s\") pod \"1f2a482e-2660-474a-9e8f-28c2d9b75648\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.902633 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-svc\") pod \"1f2a482e-2660-474a-9e8f-28c2d9b75648\" (UID: \"1f2a482e-2660-474a-9e8f-28c2d9b75648\") " Oct 01 11:46:40 crc kubenswrapper[4669]: I1001 11:46:40.915308 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2a482e-2660-474a-9e8f-28c2d9b75648-kube-api-access-7kp2s" (OuterVolumeSpecName: "kube-api-access-7kp2s") pod "1f2a482e-2660-474a-9e8f-28c2d9b75648" (UID: "1f2a482e-2660-474a-9e8f-28c2d9b75648"). InnerVolumeSpecName "kube-api-access-7kp2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.006600 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kp2s\" (UniqueName: \"kubernetes.io/projected/1f2a482e-2660-474a-9e8f-28c2d9b75648-kube-api-access-7kp2s\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.080089 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f2a482e-2660-474a-9e8f-28c2d9b75648" (UID: "1f2a482e-2660-474a-9e8f-28c2d9b75648"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.080123 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1f2a482e-2660-474a-9e8f-28c2d9b75648" (UID: "1f2a482e-2660-474a-9e8f-28c2d9b75648"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.080094 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-config" (OuterVolumeSpecName: "config") pod "1f2a482e-2660-474a-9e8f-28c2d9b75648" (UID: "1f2a482e-2660-474a-9e8f-28c2d9b75648"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.080972 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f2a482e-2660-474a-9e8f-28c2d9b75648" (UID: "1f2a482e-2660-474a-9e8f-28c2d9b75648"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.081668 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f2a482e-2660-474a-9e8f-28c2d9b75648" (UID: "1f2a482e-2660-474a-9e8f-28c2d9b75648"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.110231 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.111342 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.111639 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.111776 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.111851 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f2a482e-2660-474a-9e8f-28c2d9b75648-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.140115 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"070d0729-602a-4401-ad53-e721f87a447c","Type":"ContainerStarted","Data":"f14f59a06358c14e46fee0297a491f0970c32a227d14e4c5b4d5e78d81d82f19"} Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.140175 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"070d0729-602a-4401-ad53-e721f87a447c","Type":"ContainerStarted","Data":"5131c7251e0298e73104d363ae17a23fe62dbfe24b5c66aa6f9dc78a313b01bb"} Oct 01 11:46:41 crc 
kubenswrapper[4669]: I1001 11:46:41.142200 4669 generic.go:334] "Generic (PLEG): container finished" podID="1f2a482e-2660-474a-9e8f-28c2d9b75648" containerID="e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da" exitCode=0 Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.143315 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.145186 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" event={"ID":"1f2a482e-2660-474a-9e8f-28c2d9b75648","Type":"ContainerDied","Data":"e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da"} Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.145256 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mjq5j" event={"ID":"1f2a482e-2660-474a-9e8f-28c2d9b75648","Type":"ContainerDied","Data":"eadfb62822dc286ff82b5130b1cb9eac90209a0a9253b36e0c732de9395cf0ae"} Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.145288 4669 scope.go:117] "RemoveContainer" containerID="e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.145862 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.147135 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.182588 4669 scope.go:117] "RemoveContainer" containerID="7b4ffc80ec3be32f758dbb45e373fda41638fb7e334558ff413d468d9882ba33" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.191713 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mjq5j"] Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.202416 4669 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mjq5j"] Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.208685 4669 scope.go:117] "RemoveContainer" containerID="e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da" Oct 01 11:46:41 crc kubenswrapper[4669]: E1001 11:46:41.209322 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da\": container with ID starting with e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da not found: ID does not exist" containerID="e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.209376 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da"} err="failed to get container status \"e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da\": rpc error: code = NotFound desc = could not find container \"e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da\": container with ID starting with e7f786cfd4b3b81b684afce1310c15ad66c3051fcddf1f5275bb1518addb44da not found: ID does not exist" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.209412 4669 scope.go:117] "RemoveContainer" containerID="7b4ffc80ec3be32f758dbb45e373fda41638fb7e334558ff413d468d9882ba33" Oct 01 11:46:41 crc kubenswrapper[4669]: E1001 11:46:41.210543 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b4ffc80ec3be32f758dbb45e373fda41638fb7e334558ff413d468d9882ba33\": container with ID starting with 7b4ffc80ec3be32f758dbb45e373fda41638fb7e334558ff413d468d9882ba33 not found: ID does not exist" containerID="7b4ffc80ec3be32f758dbb45e373fda41638fb7e334558ff413d468d9882ba33" Oct 01 11:46:41 crc 
kubenswrapper[4669]: I1001 11:46:41.210572 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b4ffc80ec3be32f758dbb45e373fda41638fb7e334558ff413d468d9882ba33"} err="failed to get container status \"7b4ffc80ec3be32f758dbb45e373fda41638fb7e334558ff413d468d9882ba33\": rpc error: code = NotFound desc = could not find container \"7b4ffc80ec3be32f758dbb45e373fda41638fb7e334558ff413d468d9882ba33\": container with ID starting with 7b4ffc80ec3be32f758dbb45e373fda41638fb7e334558ff413d468d9882ba33 not found: ID does not exist" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.656688 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2a482e-2660-474a-9e8f-28c2d9b75648" path="/var/lib/kubelet/pods/1f2a482e-2660-474a-9e8f-28c2d9b75648/volumes" Oct 01 11:46:41 crc kubenswrapper[4669]: I1001 11:46:41.845765 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:42 crc kubenswrapper[4669]: I1001 11:46:42.036147 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:43 crc kubenswrapper[4669]: I1001 11:46:43.173515 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"070d0729-602a-4401-ad53-e721f87a447c","Type":"ContainerStarted","Data":"95b8385e616503734b5a9cc1a953a6e77cf347a538e902767ea66fb96fc4b63d"} Oct 01 11:46:43 crc kubenswrapper[4669]: I1001 11:46:43.174061 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 11:46:43 crc kubenswrapper[4669]: I1001 11:46:43.208553 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.40019156 podStartE2EDuration="9.20853155s" podCreationTimestamp="2025-10-01 11:46:34 +0000 UTC" firstStartedPulling="2025-10-01 11:46:36.096704921 +0000 UTC 
m=+1087.196269898" lastFinishedPulling="2025-10-01 11:46:42.905044911 +0000 UTC m=+1094.004609888" observedRunningTime="2025-10-01 11:46:43.204626036 +0000 UTC m=+1094.304191013" watchObservedRunningTime="2025-10-01 11:46:43.20853155 +0000 UTC m=+1094.308096527" Oct 01 11:46:43 crc kubenswrapper[4669]: I1001 11:46:43.862055 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:44 crc kubenswrapper[4669]: I1001 11:46:44.190149 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-h6rw6" event={"ID":"db2bb6cb-ab40-4534-967e-c71b62323512","Type":"ContainerDied","Data":"f67efb9a0c894ae8c54a4f8cfe6605d05de698063188f932a48676a902d4fa02"} Oct 01 11:46:44 crc kubenswrapper[4669]: I1001 11:46:44.190046 4669 generic.go:334] "Generic (PLEG): container finished" podID="db2bb6cb-ab40-4534-967e-c71b62323512" containerID="f67efb9a0c894ae8c54a4f8cfe6605d05de698063188f932a48676a902d4fa02" exitCode=0 Oct 01 11:46:44 crc kubenswrapper[4669]: I1001 11:46:44.361610 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-866c85f5d8-mvd64" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.254409 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.256259 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.295580 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 
11:46:45.322290 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.522224 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.522285 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.581976 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.585278 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.658823 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.738886 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-combined-ca-bundle\") pod \"db2bb6cb-ab40-4534-967e-c71b62323512\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.738948 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-scripts\") pod \"db2bb6cb-ab40-4534-967e-c71b62323512\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.738984 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-config-data\") 
pod \"db2bb6cb-ab40-4534-967e-c71b62323512\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.739028 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv22s\" (UniqueName: \"kubernetes.io/projected/db2bb6cb-ab40-4534-967e-c71b62323512-kube-api-access-mv22s\") pod \"db2bb6cb-ab40-4534-967e-c71b62323512\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.739164 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-db-sync-config-data\") pod \"db2bb6cb-ab40-4534-967e-c71b62323512\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.739218 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db2bb6cb-ab40-4534-967e-c71b62323512-etc-machine-id\") pod \"db2bb6cb-ab40-4534-967e-c71b62323512\" (UID: \"db2bb6cb-ab40-4534-967e-c71b62323512\") " Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.742767 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db2bb6cb-ab40-4534-967e-c71b62323512-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "db2bb6cb-ab40-4534-967e-c71b62323512" (UID: "db2bb6cb-ab40-4534-967e-c71b62323512"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.749569 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "db2bb6cb-ab40-4534-967e-c71b62323512" (UID: "db2bb6cb-ab40-4534-967e-c71b62323512"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.749597 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-scripts" (OuterVolumeSpecName: "scripts") pod "db2bb6cb-ab40-4534-967e-c71b62323512" (UID: "db2bb6cb-ab40-4534-967e-c71b62323512"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.750045 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db2bb6cb-ab40-4534-967e-c71b62323512-kube-api-access-mv22s" (OuterVolumeSpecName: "kube-api-access-mv22s") pod "db2bb6cb-ab40-4534-967e-c71b62323512" (UID: "db2bb6cb-ab40-4534-967e-c71b62323512"). InnerVolumeSpecName "kube-api-access-mv22s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.778713 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db2bb6cb-ab40-4534-967e-c71b62323512" (UID: "db2bb6cb-ab40-4534-967e-c71b62323512"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.789697 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84d474ff6b-jd7xf" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.825398 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-config-data" (OuterVolumeSpecName: "config-data") pod "db2bb6cb-ab40-4534-967e-c71b62323512" (UID: "db2bb6cb-ab40-4534-967e-c71b62323512"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.841763 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.841803 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.841816 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.841830 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv22s\" (UniqueName: \"kubernetes.io/projected/db2bb6cb-ab40-4534-967e-c71b62323512-kube-api-access-mv22s\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.841843 4669 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db2bb6cb-ab40-4534-967e-c71b62323512-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.841854 4669 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db2bb6cb-ab40-4534-967e-c71b62323512-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.903854 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5fbb868f5d-l9gnp"] Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.904988 4669 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-5fbb868f5d-l9gnp" podUID="e4b949b8-5f5e-4f46-836c-7be0991a67d9" containerName="barbican-api-log" containerID="cri-o://372a8648a796d20fe068b4ce1c4d817639caf884cb78f9712e2eab7b93a5788f" gracePeriod=30 Oct 01 11:46:45 crc kubenswrapper[4669]: I1001 11:46:45.905814 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5fbb868f5d-l9gnp" podUID="e4b949b8-5f5e-4f46-836c-7be0991a67d9" containerName="barbican-api" containerID="cri-o://c2add750e836c213a247c06abd2f27660764e73057e068ea011873a044d4f9c0" gracePeriod=30 Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.221512 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-h6rw6" event={"ID":"db2bb6cb-ab40-4534-967e-c71b62323512","Type":"ContainerDied","Data":"4e4194e0182535e53e4af094857a4c8aca1f483a1262bbeb8b301d806b6a2ccc"} Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.221567 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e4194e0182535e53e4af094857a4c8aca1f483a1262bbeb8b301d806b6a2ccc" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.221678 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-h6rw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.240869 4669 generic.go:334] "Generic (PLEG): container finished" podID="e4b949b8-5f5e-4f46-836c-7be0991a67d9" containerID="372a8648a796d20fe068b4ce1c4d817639caf884cb78f9712e2eab7b93a5788f" exitCode=143 Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.241962 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbb868f5d-l9gnp" event={"ID":"e4b949b8-5f5e-4f46-836c-7be0991a67d9","Type":"ContainerDied","Data":"372a8648a796d20fe068b4ce1c4d817639caf884cb78f9712e2eab7b93a5788f"} Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.244336 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.244378 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.244393 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.244408 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.568677 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 11:46:46 crc kubenswrapper[4669]: E1001 11:46:46.577666 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db2bb6cb-ab40-4534-967e-c71b62323512" containerName="cinder-db-sync" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.577775 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2bb6cb-ab40-4534-967e-c71b62323512" containerName="cinder-db-sync" Oct 01 11:46:46 crc kubenswrapper[4669]: E1001 11:46:46.577861 4669 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1f2a482e-2660-474a-9e8f-28c2d9b75648" containerName="init" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.579487 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2a482e-2660-474a-9e8f-28c2d9b75648" containerName="init" Oct 01 11:46:46 crc kubenswrapper[4669]: E1001 11:46:46.579563 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2a482e-2660-474a-9e8f-28c2d9b75648" containerName="dnsmasq-dns" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.579620 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2a482e-2660-474a-9e8f-28c2d9b75648" containerName="dnsmasq-dns" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.580038 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="db2bb6cb-ab40-4534-967e-c71b62323512" containerName="cinder-db-sync" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.580401 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2a482e-2660-474a-9e8f-28c2d9b75648" containerName="dnsmasq-dns" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.587923 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.592725 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.593248 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.593457 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.593693 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-c5nsd" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.594978 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.617417 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-w5zw6"] Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.621992 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.673766 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.673853 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.673894 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.673957 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.674055 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xbds\" (UniqueName: \"kubernetes.io/projected/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-kube-api-access-2xbds\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " 
pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.674130 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-scripts\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.689531 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-w5zw6"] Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.776560 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.776634 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-svc\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.776685 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.776724 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.776769 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.776823 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4w8g\" (UniqueName: \"kubernetes.io/projected/af3be2a0-85e4-4833-89c9-4450ee2e5635-kube-api-access-t4w8g\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.776854 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.776884 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.776911 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-config\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: 
\"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.776961 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xbds\" (UniqueName: \"kubernetes.io/projected/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-kube-api-access-2xbds\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.777000 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-scripts\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.777068 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.778731 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.785986 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-scripts\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 
11:46:46.796422 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.797625 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.801854 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.806822 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xbds\" (UniqueName: \"kubernetes.io/projected/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-kube-api-access-2xbds\") pod \"cinder-scheduler-0\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.846267 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.848435 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.853180 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.879584 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.879680 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-svc\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.879729 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.879814 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4w8g\" (UniqueName: \"kubernetes.io/projected/af3be2a0-85e4-4833-89c9-4450ee2e5635-kube-api-access-t4w8g\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.879864 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.879891 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-config\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.881011 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-config\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.881731 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.882437 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-svc\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.882977 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" 
(UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.883837 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.883871 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.930051 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4w8g\" (UniqueName: \"kubernetes.io/projected/af3be2a0-85e4-4833-89c9-4450ee2e5635-kube-api-access-t4w8g\") pod \"dnsmasq-dns-5784cf869f-w5zw6\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.976741 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.983253 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.983302 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f86837-fdf6-41f7-be99-96311790f5de-logs\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.983385 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-scripts\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.983422 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j9gx\" (UniqueName: \"kubernetes.io/projected/a0f86837-fdf6-41f7-be99-96311790f5de-kube-api-access-2j9gx\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.983455 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.983475 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.983507 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0f86837-fdf6-41f7-be99-96311790f5de-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:46 crc kubenswrapper[4669]: I1001 11:46:46.991793 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.086007 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0f86837-fdf6-41f7-be99-96311790f5de-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.086121 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f86837-fdf6-41f7-be99-96311790f5de-logs\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.086143 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.086210 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-scripts\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.086242 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j9gx\" (UniqueName: \"kubernetes.io/projected/a0f86837-fdf6-41f7-be99-96311790f5de-kube-api-access-2j9gx\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.086270 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.086286 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.087583 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0f86837-fdf6-41f7-be99-96311790f5de-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.088575 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f86837-fdf6-41f7-be99-96311790f5de-logs\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") 
" pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.093556 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.095173 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.097314 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-scripts\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.106015 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.118799 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j9gx\" (UniqueName: \"kubernetes.io/projected/a0f86837-fdf6-41f7-be99-96311790f5de-kube-api-access-2j9gx\") pod \"cinder-api-0\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.219727 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.663270 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.789723 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-w5zw6"] Oct 01 11:46:47 crc kubenswrapper[4669]: I1001 11:46:47.954717 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 11:46:47 crc kubenswrapper[4669]: W1001 11:46:47.967170 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0f86837_fdf6_41f7_be99_96311790f5de.slice/crio-d4ed05bf75ff258f0b756858a4719bc3609a5386257cf415dbda1914b0fb3546 WatchSource:0}: Error finding container d4ed05bf75ff258f0b756858a4719bc3609a5386257cf415dbda1914b0fb3546: Status 404 returned error can't find the container with id d4ed05bf75ff258f0b756858a4719bc3609a5386257cf415dbda1914b0fb3546 Oct 01 11:46:48 crc kubenswrapper[4669]: I1001 11:46:48.289854 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"59d42f69-5e5d-498b-a47b-c2c035bb3cf4","Type":"ContainerStarted","Data":"837c9373764c53d6f10eb59b941e1139b91877422c1bf78627eb0217c4700eb9"} Oct 01 11:46:48 crc kubenswrapper[4669]: I1001 11:46:48.297316 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0f86837-fdf6-41f7-be99-96311790f5de","Type":"ContainerStarted","Data":"d4ed05bf75ff258f0b756858a4719bc3609a5386257cf415dbda1914b0fb3546"} Oct 01 11:46:48 crc kubenswrapper[4669]: I1001 11:46:48.307975 4669 generic.go:334] "Generic (PLEG): container finished" podID="af3be2a0-85e4-4833-89c9-4450ee2e5635" containerID="8b6efcae2431858e2175af34979ae7bc0ee9a353088e486520fafabf8b37dc0c" exitCode=0 Oct 01 11:46:48 crc kubenswrapper[4669]: I1001 11:46:48.308036 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" event={"ID":"af3be2a0-85e4-4833-89c9-4450ee2e5635","Type":"ContainerDied","Data":"8b6efcae2431858e2175af34979ae7bc0ee9a353088e486520fafabf8b37dc0c"} Oct 01 11:46:48 crc kubenswrapper[4669]: I1001 11:46:48.308066 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" event={"ID":"af3be2a0-85e4-4833-89c9-4450ee2e5635","Type":"ContainerStarted","Data":"08d006677944b8b465e92beaeeee845682387e7e5f54b447974e8c0de3148786"} Oct 01 11:46:48 crc kubenswrapper[4669]: I1001 11:46:48.847422 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:48 crc kubenswrapper[4669]: I1001 11:46:48.847951 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 11:46:49 crc kubenswrapper[4669]: I1001 11:46:49.248199 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 11:46:49 crc kubenswrapper[4669]: I1001 11:46:49.408294 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 11:46:49 crc kubenswrapper[4669]: I1001 11:46:49.408421 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 11:46:49 crc kubenswrapper[4669]: I1001 11:46:49.410243 4669 generic.go:334] "Generic (PLEG): container finished" podID="e4b949b8-5f5e-4f46-836c-7be0991a67d9" containerID="c2add750e836c213a247c06abd2f27660764e73057e068ea011873a044d4f9c0" exitCode=0 Oct 01 11:46:49 crc kubenswrapper[4669]: I1001 11:46:49.410360 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbb868f5d-l9gnp" event={"ID":"e4b949b8-5f5e-4f46-836c-7be0991a67d9","Type":"ContainerDied","Data":"c2add750e836c213a247c06abd2f27660764e73057e068ea011873a044d4f9c0"} Oct 01 11:46:49 crc 
kubenswrapper[4669]: I1001 11:46:49.435003 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 11:46:49 crc kubenswrapper[4669]: I1001 11:46:49.954287 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.095744 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data\") pod \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.095822 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-combined-ca-bundle\") pod \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.095990 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2qbd\" (UniqueName: \"kubernetes.io/projected/e4b949b8-5f5e-4f46-836c-7be0991a67d9-kube-api-access-h2qbd\") pod \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.096152 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data-custom\") pod \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.096177 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e4b949b8-5f5e-4f46-836c-7be0991a67d9-logs\") pod \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\" (UID: \"e4b949b8-5f5e-4f46-836c-7be0991a67d9\") " Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.097328 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b949b8-5f5e-4f46-836c-7be0991a67d9-logs" (OuterVolumeSpecName: "logs") pod "e4b949b8-5f5e-4f46-836c-7be0991a67d9" (UID: "e4b949b8-5f5e-4f46-836c-7be0991a67d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.122227 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e4b949b8-5f5e-4f46-836c-7be0991a67d9" (UID: "e4b949b8-5f5e-4f46-836c-7be0991a67d9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.131389 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b949b8-5f5e-4f46-836c-7be0991a67d9-kube-api-access-h2qbd" (OuterVolumeSpecName: "kube-api-access-h2qbd") pod "e4b949b8-5f5e-4f46-836c-7be0991a67d9" (UID: "e4b949b8-5f5e-4f46-836c-7be0991a67d9"). InnerVolumeSpecName "kube-api-access-h2qbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.198619 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2qbd\" (UniqueName: \"kubernetes.io/projected/e4b949b8-5f5e-4f46-836c-7be0991a67d9-kube-api-access-h2qbd\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.199211 4669 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.199223 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b949b8-5f5e-4f46-836c-7be0991a67d9-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.199322 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data" (OuterVolumeSpecName: "config-data") pod "e4b949b8-5f5e-4f46-836c-7be0991a67d9" (UID: "e4b949b8-5f5e-4f46-836c-7be0991a67d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.229453 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4b949b8-5f5e-4f46-836c-7be0991a67d9" (UID: "e4b949b8-5f5e-4f46-836c-7be0991a67d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.304985 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.305021 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b949b8-5f5e-4f46-836c-7be0991a67d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.457688 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0f86837-fdf6-41f7-be99-96311790f5de","Type":"ContainerStarted","Data":"e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8"} Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.462153 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" event={"ID":"af3be2a0-85e4-4833-89c9-4450ee2e5635","Type":"ContainerStarted","Data":"a0680eea8c03ebff6a0993e3f57b30a42388ff869bfb969138f837828631b85c"} Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.462317 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.471343 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbb868f5d-l9gnp" event={"ID":"e4b949b8-5f5e-4f46-836c-7be0991a67d9","Type":"ContainerDied","Data":"4b73f7d0a3f16ee82a4eeac8fc7804552571c8ac7348ec15ea03dbc63bf9c468"} Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.471404 4669 scope.go:117] "RemoveContainer" containerID="c2add750e836c213a247c06abd2f27660764e73057e068ea011873a044d4f9c0" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.471543 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fbb868f5d-l9gnp" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.494725 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"59d42f69-5e5d-498b-a47b-c2c035bb3cf4","Type":"ContainerStarted","Data":"b1cd9f0e242b16755a3f6a3ea4dc0fc22066c190f2bdac9d7dee651f0bafd1bb"} Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.507610 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.526427 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" podStartSLOduration=4.526399365 podStartE2EDuration="4.526399365s" podCreationTimestamp="2025-10-01 11:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:50.506715625 +0000 UTC m=+1101.606280612" watchObservedRunningTime="2025-10-01 11:46:50.526399365 +0000 UTC m=+1101.625964342" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.550983 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5fbb868f5d-l9gnp"] Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.552362 4669 scope.go:117] "RemoveContainer" containerID="372a8648a796d20fe068b4ce1c4d817639caf884cb78f9712e2eab7b93a5788f" Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.559624 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5fbb868f5d-l9gnp"] Oct 01 11:46:50 crc kubenswrapper[4669]: I1001 11:46:50.722844 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5d99769bb4-lq4fx" Oct 01 11:46:51 crc kubenswrapper[4669]: I1001 11:46:51.506707 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"a0f86837-fdf6-41f7-be99-96311790f5de","Type":"ContainerStarted","Data":"2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8"} Oct 01 11:46:51 crc kubenswrapper[4669]: I1001 11:46:51.507628 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a0f86837-fdf6-41f7-be99-96311790f5de" containerName="cinder-api-log" containerID="cri-o://e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8" gracePeriod=30 Oct 01 11:46:51 crc kubenswrapper[4669]: I1001 11:46:51.508072 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 11:46:51 crc kubenswrapper[4669]: I1001 11:46:51.508456 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a0f86837-fdf6-41f7-be99-96311790f5de" containerName="cinder-api" containerID="cri-o://2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8" gracePeriod=30 Oct 01 11:46:51 crc kubenswrapper[4669]: I1001 11:46:51.518123 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"59d42f69-5e5d-498b-a47b-c2c035bb3cf4","Type":"ContainerStarted","Data":"e554f9b57f604085cea48b13e2fda2c8e41278e1a63c4585dc28991f1411876e"} Oct 01 11:46:51 crc kubenswrapper[4669]: I1001 11:46:51.531109 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.531083983 podStartE2EDuration="5.531083983s" podCreationTimestamp="2025-10-01 11:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:51.526418018 +0000 UTC m=+1102.625983005" watchObservedRunningTime="2025-10-01 11:46:51.531083983 +0000 UTC m=+1102.630648960" Oct 01 11:46:51 crc kubenswrapper[4669]: I1001 11:46:51.557560 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=4.405003125 podStartE2EDuration="5.557535447s" podCreationTimestamp="2025-10-01 11:46:46 +0000 UTC" firstStartedPulling="2025-10-01 11:46:47.716690295 +0000 UTC m=+1098.816255272" lastFinishedPulling="2025-10-01 11:46:48.869222617 +0000 UTC m=+1099.968787594" observedRunningTime="2025-10-01 11:46:51.553448878 +0000 UTC m=+1102.653013875" watchObservedRunningTime="2025-10-01 11:46:51.557535447 +0000 UTC m=+1102.657100424" Oct 01 11:46:51 crc kubenswrapper[4669]: I1001 11:46:51.657400 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4b949b8-5f5e-4f46-836c-7be0991a67d9" path="/var/lib/kubelet/pods/e4b949b8-5f5e-4f46-836c-7be0991a67d9/volumes" Oct 01 11:46:51 crc kubenswrapper[4669]: I1001 11:46:51.977996 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.176582 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.269277 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j9gx\" (UniqueName: \"kubernetes.io/projected/a0f86837-fdf6-41f7-be99-96311790f5de-kube-api-access-2j9gx\") pod \"a0f86837-fdf6-41f7-be99-96311790f5de\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.269345 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data\") pod \"a0f86837-fdf6-41f7-be99-96311790f5de\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.269377 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-scripts\") pod \"a0f86837-fdf6-41f7-be99-96311790f5de\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.269433 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f86837-fdf6-41f7-be99-96311790f5de-logs\") pod \"a0f86837-fdf6-41f7-be99-96311790f5de\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.269583 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0f86837-fdf6-41f7-be99-96311790f5de-etc-machine-id\") pod \"a0f86837-fdf6-41f7-be99-96311790f5de\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.269613 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data-custom\") pod \"a0f86837-fdf6-41f7-be99-96311790f5de\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.269670 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-combined-ca-bundle\") pod \"a0f86837-fdf6-41f7-be99-96311790f5de\" (UID: \"a0f86837-fdf6-41f7-be99-96311790f5de\") " Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.270854 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0f86837-fdf6-41f7-be99-96311790f5de-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a0f86837-fdf6-41f7-be99-96311790f5de" (UID: "a0f86837-fdf6-41f7-be99-96311790f5de"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.271145 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f86837-fdf6-41f7-be99-96311790f5de-logs" (OuterVolumeSpecName: "logs") pod "a0f86837-fdf6-41f7-be99-96311790f5de" (UID: "a0f86837-fdf6-41f7-be99-96311790f5de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.279334 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a0f86837-fdf6-41f7-be99-96311790f5de" (UID: "a0f86837-fdf6-41f7-be99-96311790f5de"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.280346 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-scripts" (OuterVolumeSpecName: "scripts") pod "a0f86837-fdf6-41f7-be99-96311790f5de" (UID: "a0f86837-fdf6-41f7-be99-96311790f5de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.283408 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f86837-fdf6-41f7-be99-96311790f5de-kube-api-access-2j9gx" (OuterVolumeSpecName: "kube-api-access-2j9gx") pod "a0f86837-fdf6-41f7-be99-96311790f5de" (UID: "a0f86837-fdf6-41f7-be99-96311790f5de"). InnerVolumeSpecName "kube-api-access-2j9gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.346335 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0f86837-fdf6-41f7-be99-96311790f5de" (UID: "a0f86837-fdf6-41f7-be99-96311790f5de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.361625 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data" (OuterVolumeSpecName: "config-data") pod "a0f86837-fdf6-41f7-be99-96311790f5de" (UID: "a0f86837-fdf6-41f7-be99-96311790f5de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.372278 4669 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0f86837-fdf6-41f7-be99-96311790f5de-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.372311 4669 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.372321 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.372330 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j9gx\" (UniqueName: \"kubernetes.io/projected/a0f86837-fdf6-41f7-be99-96311790f5de-kube-api-access-2j9gx\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.372348 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.372364 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f86837-fdf6-41f7-be99-96311790f5de-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.372373 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f86837-fdf6-41f7-be99-96311790f5de-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.531617 4669 generic.go:334] "Generic 
(PLEG): container finished" podID="a0f86837-fdf6-41f7-be99-96311790f5de" containerID="2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8" exitCode=0 Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.531660 4669 generic.go:334] "Generic (PLEG): container finished" podID="a0f86837-fdf6-41f7-be99-96311790f5de" containerID="e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8" exitCode=143 Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.531729 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.531736 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0f86837-fdf6-41f7-be99-96311790f5de","Type":"ContainerDied","Data":"2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8"} Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.531820 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0f86837-fdf6-41f7-be99-96311790f5de","Type":"ContainerDied","Data":"e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8"} Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.531835 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0f86837-fdf6-41f7-be99-96311790f5de","Type":"ContainerDied","Data":"d4ed05bf75ff258f0b756858a4719bc3609a5386257cf415dbda1914b0fb3546"} Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.531861 4669 scope.go:117] "RemoveContainer" containerID="2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.571181 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.582237 4669 scope.go:117] "RemoveContainer" containerID="e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8" Oct 01 
11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.585440 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.615451 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 11:46:52 crc kubenswrapper[4669]: E1001 11:46:52.616020 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b949b8-5f5e-4f46-836c-7be0991a67d9" containerName="barbican-api" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.616044 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b949b8-5f5e-4f46-836c-7be0991a67d9" containerName="barbican-api" Oct 01 11:46:52 crc kubenswrapper[4669]: E1001 11:46:52.616071 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f86837-fdf6-41f7-be99-96311790f5de" containerName="cinder-api" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.616148 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f86837-fdf6-41f7-be99-96311790f5de" containerName="cinder-api" Oct 01 11:46:52 crc kubenswrapper[4669]: E1001 11:46:52.616169 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b949b8-5f5e-4f46-836c-7be0991a67d9" containerName="barbican-api-log" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.616177 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b949b8-5f5e-4f46-836c-7be0991a67d9" containerName="barbican-api-log" Oct 01 11:46:52 crc kubenswrapper[4669]: E1001 11:46:52.616199 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f86837-fdf6-41f7-be99-96311790f5de" containerName="cinder-api-log" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.616205 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f86837-fdf6-41f7-be99-96311790f5de" containerName="cinder-api-log" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.616426 4669 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a0f86837-fdf6-41f7-be99-96311790f5de" containerName="cinder-api" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.616442 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f86837-fdf6-41f7-be99-96311790f5de" containerName="cinder-api-log" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.616453 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4b949b8-5f5e-4f46-836c-7be0991a67d9" containerName="barbican-api-log" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.616460 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4b949b8-5f5e-4f46-836c-7be0991a67d9" containerName="barbican-api" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.617657 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.623790 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.624011 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.624141 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.631126 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.633719 4669 scope.go:117] "RemoveContainer" containerID="2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8" Oct 01 11:46:52 crc kubenswrapper[4669]: E1001 11:46:52.637202 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8\": container with ID starting with 
2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8 not found: ID does not exist" containerID="2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.637251 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8"} err="failed to get container status \"2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8\": rpc error: code = NotFound desc = could not find container \"2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8\": container with ID starting with 2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8 not found: ID does not exist" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.637279 4669 scope.go:117] "RemoveContainer" containerID="e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8" Oct 01 11:46:52 crc kubenswrapper[4669]: E1001 11:46:52.637704 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8\": container with ID starting with e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8 not found: ID does not exist" containerID="e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.637728 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8"} err="failed to get container status \"e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8\": rpc error: code = NotFound desc = could not find container \"e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8\": container with ID starting with e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8 not found: ID does not 
exist" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.637741 4669 scope.go:117] "RemoveContainer" containerID="2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.638295 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8"} err="failed to get container status \"2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8\": rpc error: code = NotFound desc = could not find container \"2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8\": container with ID starting with 2a76665579148df92b6b9e02f610807b55c332cf725de6b45c1dd8a5f26ce5b8 not found: ID does not exist" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.638312 4669 scope.go:117] "RemoveContainer" containerID="e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.638834 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8"} err="failed to get container status \"e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8\": rpc error: code = NotFound desc = could not find container \"e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8\": container with ID starting with e9d4693ab55f06dab647974a3ffecb17fe0a873ba53b81639bdd7e75c079c8a8 not found: ID does not exist" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.785992 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zznvd\" (UniqueName: \"kubernetes.io/projected/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-kube-api-access-zznvd\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 
11:46:52.786375 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.786451 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-logs\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.786468 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.786512 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-scripts\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.786566 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.786603 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-config-data\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.786628 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.787036 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.888570 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.888748 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.889308 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-config-data\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 
11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.889515 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.889790 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.890114 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zznvd\" (UniqueName: \"kubernetes.io/projected/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-kube-api-access-zznvd\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.890268 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.890423 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-logs\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.890593 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.890743 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-scripts\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.890982 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-logs\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.895730 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.896064 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-scripts\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.898880 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.899862 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-config-data\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.900671 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.904655 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.907661 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zznvd\" (UniqueName: \"kubernetes.io/projected/0ad8d85d-0bac-4894-91c9-ad9cd6d485ad-kube-api-access-zznvd\") pod \"cinder-api-0\" (UID: \"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad\") " pod="openstack/cinder-api-0" Oct 01 11:46:52 crc kubenswrapper[4669]: I1001 11:46:52.999145 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 11:46:53 crc kubenswrapper[4669]: I1001 11:46:53.500554 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 11:46:53 crc kubenswrapper[4669]: I1001 11:46:53.542987 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad","Type":"ContainerStarted","Data":"02024a5906865b93cd2037a13bc6a9b23038bc4a5a43f6d1bfe58a25467fc11e"} Oct 01 11:46:53 crc kubenswrapper[4669]: I1001 11:46:53.668829 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f86837-fdf6-41f7-be99-96311790f5de" path="/var/lib/kubelet/pods/a0f86837-fdf6-41f7-be99-96311790f5de/volumes" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.023176 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.025256 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.032482 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-p7vtj" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.032984 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.033226 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.094018 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.121291 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68adea0-9ec1-4cc3-a727-a64457a70c9b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.121354 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d68adea0-9ec1-4cc3-a727-a64457a70c9b-openstack-config\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.121535 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7xmz\" (UniqueName: \"kubernetes.io/projected/d68adea0-9ec1-4cc3-a727-a64457a70c9b-kube-api-access-b7xmz\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.121628 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d68adea0-9ec1-4cc3-a727-a64457a70c9b-openstack-config-secret\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.224023 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68adea0-9ec1-4cc3-a727-a64457a70c9b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.224123 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d68adea0-9ec1-4cc3-a727-a64457a70c9b-openstack-config\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.224169 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7xmz\" (UniqueName: \"kubernetes.io/projected/d68adea0-9ec1-4cc3-a727-a64457a70c9b-kube-api-access-b7xmz\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.224193 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d68adea0-9ec1-4cc3-a727-a64457a70c9b-openstack-config-secret\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.227048 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/d68adea0-9ec1-4cc3-a727-a64457a70c9b-openstack-config\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.232760 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68adea0-9ec1-4cc3-a727-a64457a70c9b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.245293 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7xmz\" (UniqueName: \"kubernetes.io/projected/d68adea0-9ec1-4cc3-a727-a64457a70c9b-kube-api-access-b7xmz\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.256445 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d68adea0-9ec1-4cc3-a727-a64457a70c9b-openstack-config-secret\") pod \"openstackclient\" (UID: \"d68adea0-9ec1-4cc3-a727-a64457a70c9b\") " pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.361038 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-866c85f5d8-mvd64" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.361233 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.408774 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.567315 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad","Type":"ContainerStarted","Data":"823e80d0d6c221cb90868a82314fe308dcc385c8d013eac171cf06dc53cab5ac"} Oct 01 11:46:54 crc kubenswrapper[4669]: I1001 11:46:54.902522 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 11:46:55 crc kubenswrapper[4669]: I1001 11:46:55.582274 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d68adea0-9ec1-4cc3-a727-a64457a70c9b","Type":"ContainerStarted","Data":"1f65b5bc1107ac67edfd77baed168088b182fc50c3613c39b6de8363d291e45b"} Oct 01 11:46:55 crc kubenswrapper[4669]: I1001 11:46:55.584594 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ad8d85d-0bac-4894-91c9-ad9cd6d485ad","Type":"ContainerStarted","Data":"83d0971c5ec5028fba0b8ff2a34c63b7c15e38154c9b21733ccbd2d66883a8aa"} Oct 01 11:46:55 crc kubenswrapper[4669]: I1001 11:46:55.584815 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 11:46:55 crc kubenswrapper[4669]: I1001 11:46:55.610815 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.610782479 podStartE2EDuration="3.610782479s" podCreationTimestamp="2025-10-01 11:46:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:46:55.606805401 +0000 UTC m=+1106.706370388" watchObservedRunningTime="2025-10-01 11:46:55.610782479 +0000 UTC m=+1106.710347456" Oct 01 11:46:56 crc kubenswrapper[4669]: I1001 11:46:56.993278 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.103756 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rxmmm"] Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.104253 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" podUID="e830daa9-ce44-48e3-8a0e-61a51a58b2b2" containerName="dnsmasq-dns" containerID="cri-o://3f2b1a344e4cd849a66e7f8f80ab83bf8881c32632fe6c57cbe158548e0aeaf4" gracePeriod=10 Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.297816 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.348287 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.611115 4669 generic.go:334] "Generic (PLEG): container finished" podID="e830daa9-ce44-48e3-8a0e-61a51a58b2b2" containerID="3f2b1a344e4cd849a66e7f8f80ab83bf8881c32632fe6c57cbe158548e0aeaf4" exitCode=0 Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.611623 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="59d42f69-5e5d-498b-a47b-c2c035bb3cf4" containerName="cinder-scheduler" containerID="cri-o://b1cd9f0e242b16755a3f6a3ea4dc0fc22066c190f2bdac9d7dee651f0bafd1bb" gracePeriod=30 Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.612018 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" event={"ID":"e830daa9-ce44-48e3-8a0e-61a51a58b2b2","Type":"ContainerDied","Data":"3f2b1a344e4cd849a66e7f8f80ab83bf8881c32632fe6c57cbe158548e0aeaf4"} Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.612060 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" 
event={"ID":"e830daa9-ce44-48e3-8a0e-61a51a58b2b2","Type":"ContainerDied","Data":"bd4223d7730a3ed11b849bc9f50b65250e0309027e2e72c4d9db3c2ae7410b7f"} Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.612091 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd4223d7730a3ed11b849bc9f50b65250e0309027e2e72c4d9db3c2ae7410b7f" Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.612423 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="59d42f69-5e5d-498b-a47b-c2c035bb3cf4" containerName="probe" containerID="cri-o://e554f9b57f604085cea48b13e2fda2c8e41278e1a63c4585dc28991f1411876e" gracePeriod=30 Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.689008 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.805957 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-svc\") pod \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.806006 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnlhk\" (UniqueName: \"kubernetes.io/projected/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-kube-api-access-jnlhk\") pod \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.806165 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-nb\") pod \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " Oct 01 11:46:57 crc 
kubenswrapper[4669]: I1001 11:46:57.806203 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-swift-storage-0\") pod \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.806302 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-sb\") pod \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.806459 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-config\") pod \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\" (UID: \"e830daa9-ce44-48e3-8a0e-61a51a58b2b2\") " Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.815777 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-kube-api-access-jnlhk" (OuterVolumeSpecName: "kube-api-access-jnlhk") pod "e830daa9-ce44-48e3-8a0e-61a51a58b2b2" (UID: "e830daa9-ce44-48e3-8a0e-61a51a58b2b2"). InnerVolumeSpecName "kube-api-access-jnlhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.836918 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.837702 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="ceilometer-central-agent" containerID="cri-o://3f8f1272aef600a2c67d9d6be60e69f39c589a20ced292505152864eb7e3450b" gracePeriod=30 Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.843146 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="sg-core" containerID="cri-o://f14f59a06358c14e46fee0297a491f0970c32a227d14e4c5b4d5e78d81d82f19" gracePeriod=30 Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.843353 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="proxy-httpd" containerID="cri-o://95b8385e616503734b5a9cc1a953a6e77cf347a538e902767ea66fb96fc4b63d" gracePeriod=30 Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.843425 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="ceilometer-notification-agent" containerID="cri-o://5131c7251e0298e73104d363ae17a23fe62dbfe24b5c66aa6f9dc78a313b01bb" gracePeriod=30 Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.870178 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.912810 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnlhk\" (UniqueName: 
\"kubernetes.io/projected/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-kube-api-access-jnlhk\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.919723 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e830daa9-ce44-48e3-8a0e-61a51a58b2b2" (UID: "e830daa9-ce44-48e3-8a0e-61a51a58b2b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.927691 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e830daa9-ce44-48e3-8a0e-61a51a58b2b2" (UID: "e830daa9-ce44-48e3-8a0e-61a51a58b2b2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.953950 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e830daa9-ce44-48e3-8a0e-61a51a58b2b2" (UID: "e830daa9-ce44-48e3-8a0e-61a51a58b2b2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.965918 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-config" (OuterVolumeSpecName: "config") pod "e830daa9-ce44-48e3-8a0e-61a51a58b2b2" (UID: "e830daa9-ce44-48e3-8a0e-61a51a58b2b2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:57 crc kubenswrapper[4669]: I1001 11:46:57.974322 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e830daa9-ce44-48e3-8a0e-61a51a58b2b2" (UID: "e830daa9-ce44-48e3-8a0e-61a51a58b2b2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.017843 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.017899 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.017910 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.017920 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.017933 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e830daa9-ce44-48e3-8a0e-61a51a58b2b2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.624462 4669 generic.go:334] "Generic (PLEG): container finished" podID="59d42f69-5e5d-498b-a47b-c2c035bb3cf4" 
containerID="e554f9b57f604085cea48b13e2fda2c8e41278e1a63c4585dc28991f1411876e" exitCode=0 Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.624554 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"59d42f69-5e5d-498b-a47b-c2c035bb3cf4","Type":"ContainerDied","Data":"e554f9b57f604085cea48b13e2fda2c8e41278e1a63c4585dc28991f1411876e"} Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.628685 4669 generic.go:334] "Generic (PLEG): container finished" podID="070d0729-602a-4401-ad53-e721f87a447c" containerID="95b8385e616503734b5a9cc1a953a6e77cf347a538e902767ea66fb96fc4b63d" exitCode=0 Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.628729 4669 generic.go:334] "Generic (PLEG): container finished" podID="070d0729-602a-4401-ad53-e721f87a447c" containerID="f14f59a06358c14e46fee0297a491f0970c32a227d14e4c5b4d5e78d81d82f19" exitCode=2 Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.628740 4669 generic.go:334] "Generic (PLEG): container finished" podID="070d0729-602a-4401-ad53-e721f87a447c" containerID="3f8f1272aef600a2c67d9d6be60e69f39c589a20ced292505152864eb7e3450b" exitCode=0 Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.628738 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"070d0729-602a-4401-ad53-e721f87a447c","Type":"ContainerDied","Data":"95b8385e616503734b5a9cc1a953a6e77cf347a538e902767ea66fb96fc4b63d"} Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.628821 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"070d0729-602a-4401-ad53-e721f87a447c","Type":"ContainerDied","Data":"f14f59a06358c14e46fee0297a491f0970c32a227d14e4c5b4d5e78d81d82f19"} Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.628835 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"070d0729-602a-4401-ad53-e721f87a447c","Type":"ContainerDied","Data":"3f8f1272aef600a2c67d9d6be60e69f39c589a20ced292505152864eb7e3450b"} Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.628858 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-rxmmm" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.664432 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rxmmm"] Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.674551 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-rxmmm"] Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.782987 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6c769b8b9-5svbp"] Oct 01 11:46:58 crc kubenswrapper[4669]: E1001 11:46:58.783491 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e830daa9-ce44-48e3-8a0e-61a51a58b2b2" containerName="init" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.783510 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e830daa9-ce44-48e3-8a0e-61a51a58b2b2" containerName="init" Oct 01 11:46:58 crc kubenswrapper[4669]: E1001 11:46:58.783537 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e830daa9-ce44-48e3-8a0e-61a51a58b2b2" containerName="dnsmasq-dns" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.783544 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e830daa9-ce44-48e3-8a0e-61a51a58b2b2" containerName="dnsmasq-dns" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.783726 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e830daa9-ce44-48e3-8a0e-61a51a58b2b2" containerName="dnsmasq-dns" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.794146 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.796559 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.798524 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.798822 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.823184 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c769b8b9-5svbp"] Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.833354 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-internal-tls-certs\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.833481 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lq6t\" (UniqueName: \"kubernetes.io/projected/fd677364-3064-4b42-9555-b640561fa4ed-kube-api-access-6lq6t\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.833509 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd677364-3064-4b42-9555-b640561fa4ed-log-httpd\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc 
kubenswrapper[4669]: I1001 11:46:58.833549 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-public-tls-certs\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.833569 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-config-data\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.833601 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd677364-3064-4b42-9555-b640561fa4ed-etc-swift\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.833621 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd677364-3064-4b42-9555-b640561fa4ed-run-httpd\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.833644 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-combined-ca-bundle\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 
11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.935580 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd677364-3064-4b42-9555-b640561fa4ed-etc-swift\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.935650 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd677364-3064-4b42-9555-b640561fa4ed-run-httpd\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.935716 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-combined-ca-bundle\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.935802 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-internal-tls-certs\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.935912 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lq6t\" (UniqueName: \"kubernetes.io/projected/fd677364-3064-4b42-9555-b640561fa4ed-kube-api-access-6lq6t\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.935935 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd677364-3064-4b42-9555-b640561fa4ed-log-httpd\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.935987 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-public-tls-certs\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.936008 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-config-data\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.939933 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd677364-3064-4b42-9555-b640561fa4ed-run-httpd\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.941536 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd677364-3064-4b42-9555-b640561fa4ed-log-httpd\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.948908 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-internal-tls-certs\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.959340 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd677364-3064-4b42-9555-b640561fa4ed-etc-swift\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.963363 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-config-data\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.975016 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-combined-ca-bundle\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.982310 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd677364-3064-4b42-9555-b640561fa4ed-public-tls-certs\") pod \"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:58 crc kubenswrapper[4669]: I1001 11:46:58.986046 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lq6t\" (UniqueName: \"kubernetes.io/projected/fd677364-3064-4b42-9555-b640561fa4ed-kube-api-access-6lq6t\") pod 
\"swift-proxy-6c769b8b9-5svbp\" (UID: \"fd677364-3064-4b42-9555-b640561fa4ed\") " pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.152493 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:46:59 crc kubenswrapper[4669]: E1001 11:46:59.436708 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62dab5a8_a8e3_4496_8187_089069b8e14f.slice/crio-conmon-6836b5a3a12e44f349fe24052dfc7816b67cc72171ee1b8dc050d26ad2b5f3bc.scope\": RecentStats: unable to find data in memory cache]" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.645617 4669 generic.go:334] "Generic (PLEG): container finished" podID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerID="6836b5a3a12e44f349fe24052dfc7816b67cc72171ee1b8dc050d26ad2b5f3bc" exitCode=137 Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.664817 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e830daa9-ce44-48e3-8a0e-61a51a58b2b2" path="/var/lib/kubelet/pods/e830daa9-ce44-48e3-8a0e-61a51a58b2b2/volumes" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.666808 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866c85f5d8-mvd64" event={"ID":"62dab5a8-a8e3-4496-8187-089069b8e14f","Type":"ContainerDied","Data":"6836b5a3a12e44f349fe24052dfc7816b67cc72171ee1b8dc050d26ad2b5f3bc"} Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.787472 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.803262 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c769b8b9-5svbp"] Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.868510 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-config-data\") pod \"62dab5a8-a8e3-4496-8187-089069b8e14f\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.870018 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2bc7\" (UniqueName: \"kubernetes.io/projected/62dab5a8-a8e3-4496-8187-089069b8e14f-kube-api-access-r2bc7\") pod \"62dab5a8-a8e3-4496-8187-089069b8e14f\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.870072 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62dab5a8-a8e3-4496-8187-089069b8e14f-logs\") pod \"62dab5a8-a8e3-4496-8187-089069b8e14f\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.870131 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-tls-certs\") pod \"62dab5a8-a8e3-4496-8187-089069b8e14f\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.870186 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-combined-ca-bundle\") pod \"62dab5a8-a8e3-4496-8187-089069b8e14f\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " Oct 
01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.870278 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-secret-key\") pod \"62dab5a8-a8e3-4496-8187-089069b8e14f\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.870323 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-scripts\") pod \"62dab5a8-a8e3-4496-8187-089069b8e14f\" (UID: \"62dab5a8-a8e3-4496-8187-089069b8e14f\") " Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.872226 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62dab5a8-a8e3-4496-8187-089069b8e14f-logs" (OuterVolumeSpecName: "logs") pod "62dab5a8-a8e3-4496-8187-089069b8e14f" (UID: "62dab5a8-a8e3-4496-8187-089069b8e14f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.887480 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62dab5a8-a8e3-4496-8187-089069b8e14f-kube-api-access-r2bc7" (OuterVolumeSpecName: "kube-api-access-r2bc7") pod "62dab5a8-a8e3-4496-8187-089069b8e14f" (UID: "62dab5a8-a8e3-4496-8187-089069b8e14f"). InnerVolumeSpecName "kube-api-access-r2bc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.900549 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "62dab5a8-a8e3-4496-8187-089069b8e14f" (UID: "62dab5a8-a8e3-4496-8187-089069b8e14f"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.911367 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-config-data" (OuterVolumeSpecName: "config-data") pod "62dab5a8-a8e3-4496-8187-089069b8e14f" (UID: "62dab5a8-a8e3-4496-8187-089069b8e14f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.917634 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-scripts" (OuterVolumeSpecName: "scripts") pod "62dab5a8-a8e3-4496-8187-089069b8e14f" (UID: "62dab5a8-a8e3-4496-8187-089069b8e14f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.921319 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62dab5a8-a8e3-4496-8187-089069b8e14f" (UID: "62dab5a8-a8e3-4496-8187-089069b8e14f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.953362 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "62dab5a8-a8e3-4496-8187-089069b8e14f" (UID: "62dab5a8-a8e3-4496-8187-089069b8e14f"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.982621 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.982674 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2bc7\" (UniqueName: \"kubernetes.io/projected/62dab5a8-a8e3-4496-8187-089069b8e14f-kube-api-access-r2bc7\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.982695 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62dab5a8-a8e3-4496-8187-089069b8e14f-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.982703 4669 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.982713 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.982721 4669 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/62dab5a8-a8e3-4496-8187-089069b8e14f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:46:59 crc kubenswrapper[4669]: I1001 11:46:59.982729 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62dab5a8-a8e3-4496-8187-089069b8e14f-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.315964 4669 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.661298 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c769b8b9-5svbp" event={"ID":"fd677364-3064-4b42-9555-b640561fa4ed","Type":"ContainerStarted","Data":"4eb53fa87370104a9e226d6b1d7af7bfb8494486b047b8100943c83f4e1ef651"} Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.661712 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c769b8b9-5svbp" event={"ID":"fd677364-3064-4b42-9555-b640561fa4ed","Type":"ContainerStarted","Data":"5088909f56e2fac21e5975003853eedf154f12ff6f0dc68d80f8071c44fdf27d"} Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.661729 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c769b8b9-5svbp" event={"ID":"fd677364-3064-4b42-9555-b640561fa4ed","Type":"ContainerStarted","Data":"822dcc3279c65b6b32c9266cd331daf96645876c8f1ba1e4594bb90b4c1456d8"} Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.661748 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.661762 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.663646 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866c85f5d8-mvd64" event={"ID":"62dab5a8-a8e3-4496-8187-089069b8e14f","Type":"ContainerDied","Data":"7a9a633ab1c7f42ed47d2997d9dc885d00311674cc27bac51f072d45c539d89b"} Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.663699 4669 scope.go:117] "RemoveContainer" containerID="e291c34c35b2e8ea3b830f371585d97db07df75e845acb5850aa9ed5690727d9" Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.663914 4669 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/horizon-866c85f5d8-mvd64" Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.689829 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6c769b8b9-5svbp" podStartSLOduration=2.689798052 podStartE2EDuration="2.689798052s" podCreationTimestamp="2025-10-01 11:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:47:00.680126326 +0000 UTC m=+1111.779691313" watchObservedRunningTime="2025-10-01 11:47:00.689798052 +0000 UTC m=+1111.789363029" Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.723604 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-866c85f5d8-mvd64"] Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.744586 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-866c85f5d8-mvd64"] Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.885476 4669 scope.go:117] "RemoveContainer" containerID="6836b5a3a12e44f349fe24052dfc7816b67cc72171ee1b8dc050d26ad2b5f3bc" Oct 01 11:47:00 crc kubenswrapper[4669]: I1001 11:47:00.977338 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:47:01 crc kubenswrapper[4669]: I1001 11:47:01.200928 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-795f7c5588-ppc46" Oct 01 11:47:01 crc kubenswrapper[4669]: I1001 11:47:01.669212 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" path="/var/lib/kubelet/pods/62dab5a8-a8e3-4496-8187-089069b8e14f/volumes" Oct 01 11:47:02 crc kubenswrapper[4669]: I1001 11:47:02.687658 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-75fdb4d7c7-7ltfb" Oct 01 11:47:02 crc kubenswrapper[4669]: I1001 11:47:02.696597 
4669 generic.go:334] "Generic (PLEG): container finished" podID="59d42f69-5e5d-498b-a47b-c2c035bb3cf4" containerID="b1cd9f0e242b16755a3f6a3ea4dc0fc22066c190f2bdac9d7dee651f0bafd1bb" exitCode=0 Oct 01 11:47:02 crc kubenswrapper[4669]: I1001 11:47:02.696675 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"59d42f69-5e5d-498b-a47b-c2c035bb3cf4","Type":"ContainerDied","Data":"b1cd9f0e242b16755a3f6a3ea4dc0fc22066c190f2bdac9d7dee651f0bafd1bb"} Oct 01 11:47:02 crc kubenswrapper[4669]: I1001 11:47:02.787477 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6fbb698fb8-vwrw5"] Oct 01 11:47:02 crc kubenswrapper[4669]: I1001 11:47:02.787747 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6fbb698fb8-vwrw5" podUID="a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" containerName="neutron-api" containerID="cri-o://9c24df43128d70950e52a0798d7495261e93ff9bcb648818e9e80f49bb11938d" gracePeriod=30 Oct 01 11:47:02 crc kubenswrapper[4669]: I1001 11:47:02.787918 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6fbb698fb8-vwrw5" podUID="a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" containerName="neutron-httpd" containerID="cri-o://1b00fbdad665789e469be97345430a77ebd335bc428a8755e3634ea5c13ee394" gracePeriod=30 Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.728959 4669 generic.go:334] "Generic (PLEG): container finished" podID="070d0729-602a-4401-ad53-e721f87a447c" containerID="5131c7251e0298e73104d363ae17a23fe62dbfe24b5c66aa6f9dc78a313b01bb" exitCode=0 Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.729050 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"070d0729-602a-4401-ad53-e721f87a447c","Type":"ContainerDied","Data":"5131c7251e0298e73104d363ae17a23fe62dbfe24b5c66aa6f9dc78a313b01bb"} Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.742606 4669 
generic.go:334] "Generic (PLEG): container finished" podID="a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" containerID="1b00fbdad665789e469be97345430a77ebd335bc428a8755e3634ea5c13ee394" exitCode=0 Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.742665 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fbb698fb8-vwrw5" event={"ID":"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598","Type":"ContainerDied","Data":"1b00fbdad665789e469be97345430a77ebd335bc428a8755e3634ea5c13ee394"} Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.830163 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-8tkmw"] Oct 01 11:47:03 crc kubenswrapper[4669]: E1001 11:47:03.831023 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon" Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.831120 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon" Oct 01 11:47:03 crc kubenswrapper[4669]: E1001 11:47:03.831221 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon-log" Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.831285 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon-log" Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.831567 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon" Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.831653 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="62dab5a8-a8e3-4496-8187-089069b8e14f" containerName="horizon-log" Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.832572 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8tkmw" Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.872323 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8tkmw"] Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.914468 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmzgf\" (UniqueName: \"kubernetes.io/projected/f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0-kube-api-access-lmzgf\") pod \"nova-api-db-create-8tkmw\" (UID: \"f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0\") " pod="openstack/nova-api-db-create-8tkmw" Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.924135 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-x5w8f"] Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.925585 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x5w8f" Oct 01 11:47:03 crc kubenswrapper[4669]: I1001 11:47:03.956729 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-x5w8f"] Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.018212 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmzgf\" (UniqueName: \"kubernetes.io/projected/f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0-kube-api-access-lmzgf\") pod \"nova-api-db-create-8tkmw\" (UID: \"f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0\") " pod="openstack/nova-api-db-create-8tkmw" Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.018374 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdngg\" (UniqueName: \"kubernetes.io/projected/7dbbc207-4a1d-40c3-8392-ebfc5def670a-kube-api-access-fdngg\") pod \"nova-cell0-db-create-x5w8f\" (UID: \"7dbbc207-4a1d-40c3-8392-ebfc5def670a\") " pod="openstack/nova-cell0-db-create-x5w8f" Oct 01 11:47:04 crc kubenswrapper[4669]: 
I1001 11:47:04.025042 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-b72rx"] Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.028755 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-b72rx" Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.048550 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-b72rx"] Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.063420 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmzgf\" (UniqueName: \"kubernetes.io/projected/f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0-kube-api-access-lmzgf\") pod \"nova-api-db-create-8tkmw\" (UID: \"f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0\") " pod="openstack/nova-api-db-create-8tkmw" Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.120899 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmbnn\" (UniqueName: \"kubernetes.io/projected/ebda0307-643b-4933-aea3-a3ea9b534f50-kube-api-access-cmbnn\") pod \"nova-cell1-db-create-b72rx\" (UID: \"ebda0307-643b-4933-aea3-a3ea9b534f50\") " pod="openstack/nova-cell1-db-create-b72rx" Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.121026 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdngg\" (UniqueName: \"kubernetes.io/projected/7dbbc207-4a1d-40c3-8392-ebfc5def670a-kube-api-access-fdngg\") pod \"nova-cell0-db-create-x5w8f\" (UID: \"7dbbc207-4a1d-40c3-8392-ebfc5def670a\") " pod="openstack/nova-cell0-db-create-x5w8f" Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.138518 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdngg\" (UniqueName: \"kubernetes.io/projected/7dbbc207-4a1d-40c3-8392-ebfc5def670a-kube-api-access-fdngg\") pod \"nova-cell0-db-create-x5w8f\" (UID: 
\"7dbbc207-4a1d-40c3-8392-ebfc5def670a\") " pod="openstack/nova-cell0-db-create-x5w8f" Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.188545 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8tkmw" Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.223489 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmbnn\" (UniqueName: \"kubernetes.io/projected/ebda0307-643b-4933-aea3-a3ea9b534f50-kube-api-access-cmbnn\") pod \"nova-cell1-db-create-b72rx\" (UID: \"ebda0307-643b-4933-aea3-a3ea9b534f50\") " pod="openstack/nova-cell1-db-create-b72rx" Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.243647 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmbnn\" (UniqueName: \"kubernetes.io/projected/ebda0307-643b-4933-aea3-a3ea9b534f50-kube-api-access-cmbnn\") pod \"nova-cell1-db-create-b72rx\" (UID: \"ebda0307-643b-4933-aea3-a3ea9b534f50\") " pod="openstack/nova-cell1-db-create-b72rx" Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.250407 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x5w8f" Oct 01 11:47:04 crc kubenswrapper[4669]: I1001 11:47:04.347851 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-b72rx" Oct 01 11:47:05 crc kubenswrapper[4669]: I1001 11:47:05.208027 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:47:05 crc kubenswrapper[4669]: I1001 11:47:05.208694 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerName="glance-log" containerID="cri-o://8b2221caba3f5623b222afb7931d94552eafe6fabfc0de0b300e59c9971a63d8" gracePeriod=30 Oct 01 11:47:05 crc kubenswrapper[4669]: I1001 11:47:05.208861 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerName="glance-httpd" containerID="cri-o://4a46420043020db08e7e7eda689a74f780f828af0ee6c273152e621e55ad694e" gracePeriod=30 Oct 01 11:47:05 crc kubenswrapper[4669]: I1001 11:47:05.396174 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 01 11:47:05 crc kubenswrapper[4669]: I1001 11:47:05.434008 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.162:3000/\": dial tcp 10.217.0.162:3000: connect: connection refused" Oct 01 11:47:05 crc kubenswrapper[4669]: I1001 11:47:05.780667 4669 generic.go:334] "Generic (PLEG): container finished" podID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerID="8b2221caba3f5623b222afb7931d94552eafe6fabfc0de0b300e59c9971a63d8" exitCode=143 Oct 01 11:47:05 crc kubenswrapper[4669]: I1001 11:47:05.780765 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e","Type":"ContainerDied","Data":"8b2221caba3f5623b222afb7931d94552eafe6fabfc0de0b300e59c9971a63d8"} Oct 01 11:47:06 crc kubenswrapper[4669]: I1001 11:47:06.207925 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:47:06 crc kubenswrapper[4669]: I1001 11:47:06.208302 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-log" containerID="cri-o://3ff0027a5d4bf893f4821b994e546cad17f5cfdaaf0fa568764fcf16ddf16d3f" gracePeriod=30 Oct 01 11:47:06 crc kubenswrapper[4669]: I1001 11:47:06.208390 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-httpd" containerID="cri-o://b5eacb4ec5ca11175d7373e4c1b4fe7356249a8933437d7c1394c5af780dabc0" gracePeriod=30 Oct 01 11:47:06 crc kubenswrapper[4669]: I1001 11:47:06.218059 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.161:9292/healthcheck\": EOF" Oct 01 11:47:06 crc kubenswrapper[4669]: I1001 11:47:06.218055 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9292/healthcheck\": EOF" Oct 01 11:47:06 crc kubenswrapper[4669]: I1001 11:47:06.218245 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/glance-default-external-api-0" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.161:9292/healthcheck\": EOF" 
Oct 01 11:47:06 crc kubenswrapper[4669]: I1001 11:47:06.218297 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/glance-default-external-api-0" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9292/healthcheck\": EOF" Oct 01 11:47:06 crc kubenswrapper[4669]: I1001 11:47:06.796093 4669 generic.go:334] "Generic (PLEG): container finished" podID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerID="3ff0027a5d4bf893f4821b994e546cad17f5cfdaaf0fa568764fcf16ddf16d3f" exitCode=143 Oct 01 11:47:06 crc kubenswrapper[4669]: I1001 11:47:06.796143 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3e7cb3e-aea7-4369-91a7-ccdaf2531415","Type":"ContainerDied","Data":"3ff0027a5d4bf893f4821b994e546cad17f5cfdaaf0fa568764fcf16ddf16d3f"} Oct 01 11:47:06 crc kubenswrapper[4669]: I1001 11:47:06.801878 4669 generic.go:334] "Generic (PLEG): container finished" podID="a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" containerID="9c24df43128d70950e52a0798d7495261e93ff9bcb648818e9e80f49bb11938d" exitCode=0 Oct 01 11:47:06 crc kubenswrapper[4669]: I1001 11:47:06.801947 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fbb698fb8-vwrw5" event={"ID":"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598","Type":"ContainerDied","Data":"9c24df43128d70950e52a0798d7495261e93ff9bcb648818e9e80f49bb11938d"} Oct 01 11:47:08 crc kubenswrapper[4669]: I1001 11:47:08.360149 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.163:9292/healthcheck\": read tcp 10.217.0.2:59830->10.217.0.163:9292: read: connection reset by peer" Oct 01 11:47:08 crc kubenswrapper[4669]: I1001 11:47:08.360325 4669 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/glance-default-internal-api-0" podUID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.163:9292/healthcheck\": read tcp 10.217.0.2:59818->10.217.0.163:9292: read: connection reset by peer" Oct 01 11:47:08 crc kubenswrapper[4669]: I1001 11:47:08.833096 4669 generic.go:334] "Generic (PLEG): container finished" podID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerID="4a46420043020db08e7e7eda689a74f780f828af0ee6c273152e621e55ad694e" exitCode=0 Oct 01 11:47:08 crc kubenswrapper[4669]: I1001 11:47:08.833297 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e","Type":"ContainerDied","Data":"4a46420043020db08e7e7eda689a74f780f828af0ee6c273152e621e55ad694e"} Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.162377 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.167660 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c769b8b9-5svbp" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.345366 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.474228 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.479307 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-scripts\") pod \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.479414 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data-custom\") pod \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.479447 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data\") pod \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.479527 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-combined-ca-bundle\") pod \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.480093 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xbds\" (UniqueName: \"kubernetes.io/projected/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-kube-api-access-2xbds\") pod \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.480248 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-etc-machine-id\") pod \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\" (UID: \"59d42f69-5e5d-498b-a47b-c2c035bb3cf4\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.480730 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "59d42f69-5e5d-498b-a47b-c2c035bb3cf4" (UID: "59d42f69-5e5d-498b-a47b-c2c035bb3cf4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.493462 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-kube-api-access-2xbds" (OuterVolumeSpecName: "kube-api-access-2xbds") pod "59d42f69-5e5d-498b-a47b-c2c035bb3cf4" (UID: "59d42f69-5e5d-498b-a47b-c2c035bb3cf4"). InnerVolumeSpecName "kube-api-access-2xbds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.493567 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-scripts" (OuterVolumeSpecName: "scripts") pod "59d42f69-5e5d-498b-a47b-c2c035bb3cf4" (UID: "59d42f69-5e5d-498b-a47b-c2c035bb3cf4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.493660 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "59d42f69-5e5d-498b-a47b-c2c035bb3cf4" (UID: "59d42f69-5e5d-498b-a47b-c2c035bb3cf4"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.551318 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59d42f69-5e5d-498b-a47b-c2c035bb3cf4" (UID: "59d42f69-5e5d-498b-a47b-c2c035bb3cf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.582788 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-scripts\") pod \"070d0729-602a-4401-ad53-e721f87a447c\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.582931 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-log-httpd\") pod \"070d0729-602a-4401-ad53-e721f87a447c\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.582962 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94xrk\" (UniqueName: \"kubernetes.io/projected/070d0729-602a-4401-ad53-e721f87a447c-kube-api-access-94xrk\") pod \"070d0729-602a-4401-ad53-e721f87a447c\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.583118 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-config-data\") pod \"070d0729-602a-4401-ad53-e721f87a447c\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.583203 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-sg-core-conf-yaml\") pod \"070d0729-602a-4401-ad53-e721f87a447c\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.583223 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-combined-ca-bundle\") pod \"070d0729-602a-4401-ad53-e721f87a447c\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.583391 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-run-httpd\") pod \"070d0729-602a-4401-ad53-e721f87a447c\" (UID: \"070d0729-602a-4401-ad53-e721f87a447c\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.583882 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.583907 4669 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.583918 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.583931 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xbds\" (UniqueName: 
\"kubernetes.io/projected/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-kube-api-access-2xbds\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.583942 4669 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.584450 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "070d0729-602a-4401-ad53-e721f87a447c" (UID: "070d0729-602a-4401-ad53-e721f87a447c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.584924 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "070d0729-602a-4401-ad53-e721f87a447c" (UID: "070d0729-602a-4401-ad53-e721f87a447c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.595766 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/070d0729-602a-4401-ad53-e721f87a447c-kube-api-access-94xrk" (OuterVolumeSpecName: "kube-api-access-94xrk") pod "070d0729-602a-4401-ad53-e721f87a447c" (UID: "070d0729-602a-4401-ad53-e721f87a447c"). InnerVolumeSpecName "kube-api-access-94xrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.595918 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-scripts" (OuterVolumeSpecName: "scripts") pod "070d0729-602a-4401-ad53-e721f87a447c" (UID: "070d0729-602a-4401-ad53-e721f87a447c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.628299 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "070d0729-602a-4401-ad53-e721f87a447c" (UID: "070d0729-602a-4401-ad53-e721f87a447c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.667116 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data" (OuterVolumeSpecName: "config-data") pod "59d42f69-5e5d-498b-a47b-c2c035bb3cf4" (UID: "59d42f69-5e5d-498b-a47b-c2c035bb3cf4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.685920 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.685951 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59d42f69-5e5d-498b-a47b-c2c035bb3cf4-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.685961 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.685972 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.685982 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/070d0729-602a-4401-ad53-e721f87a447c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.685993 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94xrk\" (UniqueName: \"kubernetes.io/projected/070d0729-602a-4401-ad53-e721f87a447c-kube-api-access-94xrk\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.758904 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "070d0729-602a-4401-ad53-e721f87a447c" (UID: "070d0729-602a-4401-ad53-e721f87a447c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.791310 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.791885 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.825350 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-config-data" (OuterVolumeSpecName: "config-data") pod "070d0729-602a-4401-ad53-e721f87a447c" (UID: "070d0729-602a-4401-ad53-e721f87a447c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.876522 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d68adea0-9ec1-4cc3-a727-a64457a70c9b","Type":"ContainerStarted","Data":"f646d1d818d04b43aad1cbf6f2c36d7a3bb1659eed350f132c32d41c9d9689d3"} Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.893856 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-config\") pod \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.893994 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfbv5\" (UniqueName: \"kubernetes.io/projected/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-kube-api-access-tfbv5\") pod \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " Oct 01 
11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.894027 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-httpd-config\") pod \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.894059 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-ovndb-tls-certs\") pod \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.894331 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-combined-ca-bundle\") pod \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\" (UID: \"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598\") " Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.894753 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/070d0729-602a-4401-ad53-e721f87a447c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.900854 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-kube-api-access-tfbv5" (OuterVolumeSpecName: "kube-api-access-tfbv5") pod "a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" (UID: "a701bc8f-fd3f-43ed-9b7a-bbb3696dc598"). InnerVolumeSpecName "kube-api-access-tfbv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.950611 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" (UID: "a701bc8f-fd3f-43ed-9b7a-bbb3696dc598"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.959246 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fbb698fb8-vwrw5" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.960672 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fbb698fb8-vwrw5" event={"ID":"a701bc8f-fd3f-43ed-9b7a-bbb3696dc598","Type":"ContainerDied","Data":"f234e6191799de49d8c81fcd839a29f3d69eedbbe274c05da476b574e5fdcda3"} Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.960752 4669 scope.go:117] "RemoveContainer" containerID="1b00fbdad665789e469be97345430a77ebd335bc428a8755e3634ea5c13ee394" Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.981233 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"59d42f69-5e5d-498b-a47b-c2c035bb3cf4","Type":"ContainerDied","Data":"837c9373764c53d6f10eb59b941e1139b91877422c1bf78627eb0217c4700eb9"} Oct 01 11:47:09 crc kubenswrapper[4669]: I1001 11:47:09.981617 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.013596 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.013655 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"070d0729-602a-4401-ad53-e721f87a447c","Type":"ContainerDied","Data":"22d05982856866330e43adca8f4e4fb5debf89033bcad62ce54c6a22ea76e110"} Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.025501 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfbv5\" (UniqueName: \"kubernetes.io/projected/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-kube-api-access-tfbv5\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.030513 4669 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.037308 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-config" (OuterVolumeSpecName: "config") pod "a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" (UID: "a701bc8f-fd3f-43ed-9b7a-bbb3696dc598"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.125267 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8tkmw"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.126464 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.797153745 podStartE2EDuration="17.126443919s" podCreationTimestamp="2025-10-01 11:46:53 +0000 UTC" firstStartedPulling="2025-10-01 11:46:54.916166782 +0000 UTC m=+1106.015731759" lastFinishedPulling="2025-10-01 11:47:09.245456956 +0000 UTC m=+1120.345021933" observedRunningTime="2025-10-01 11:47:09.951188024 +0000 UTC m=+1121.050753011" watchObservedRunningTime="2025-10-01 11:47:10.126443919 +0000 UTC m=+1121.226008896" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.132875 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.134742 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-x5w8f"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.139409 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" (UID: "a701bc8f-fd3f-43ed-9b7a-bbb3696dc598"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.142316 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-b72rx"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.150027 4669 scope.go:117] "RemoveContainer" containerID="9c24df43128d70950e52a0798d7495261e93ff9bcb648818e9e80f49bb11938d" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.210881 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.222053 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.235333 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.247282 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.250039 4669 scope.go:117] "RemoveContainer" containerID="e554f9b57f604085cea48b13e2fda2c8e41278e1a63c4585dc28991f1411876e" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.279210 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.284538 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.295519 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 11:47:10 crc kubenswrapper[4669]: E1001 11:47:10.296372 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="proxy-httpd" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.296454 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="proxy-httpd" Oct 01 11:47:10 crc kubenswrapper[4669]: E1001 11:47:10.296554 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerName="glance-log" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.296612 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerName="glance-log" Oct 01 11:47:10 crc kubenswrapper[4669]: E1001 11:47:10.296671 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d42f69-5e5d-498b-a47b-c2c035bb3cf4" containerName="probe" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.296722 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d42f69-5e5d-498b-a47b-c2c035bb3cf4" containerName="probe" Oct 01 11:47:10 crc kubenswrapper[4669]: E1001 11:47:10.296781 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d42f69-5e5d-498b-a47b-c2c035bb3cf4" containerName="cinder-scheduler" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.296838 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d42f69-5e5d-498b-a47b-c2c035bb3cf4" containerName="cinder-scheduler" Oct 01 11:47:10 crc kubenswrapper[4669]: E1001 11:47:10.296897 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerName="glance-httpd" Oct 01 11:47:10 crc 
kubenswrapper[4669]: I1001 11:47:10.296968 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerName="glance-httpd" Oct 01 11:47:10 crc kubenswrapper[4669]: E1001 11:47:10.297427 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" containerName="neutron-httpd" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.297491 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" containerName="neutron-httpd" Oct 01 11:47:10 crc kubenswrapper[4669]: E1001 11:47:10.297543 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="ceilometer-notification-agent" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.297722 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="ceilometer-notification-agent" Oct 01 11:47:10 crc kubenswrapper[4669]: E1001 11:47:10.297845 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="ceilometer-central-agent" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.297914 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="ceilometer-central-agent" Oct 01 11:47:10 crc kubenswrapper[4669]: E1001 11:47:10.297994 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="sg-core" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.298053 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="sg-core" Oct 01 11:47:10 crc kubenswrapper[4669]: E1001 11:47:10.298151 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" containerName="neutron-api" Oct 01 11:47:10 
crc kubenswrapper[4669]: I1001 11:47:10.298224 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" containerName="neutron-api" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.298536 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d42f69-5e5d-498b-a47b-c2c035bb3cf4" containerName="cinder-scheduler" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.298620 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerName="glance-httpd" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.298677 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="ceilometer-central-agent" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.298750 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="ceilometer-notification-agent" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.298812 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="sg-core" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.298871 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" containerName="neutron-api" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.298942 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d42f69-5e5d-498b-a47b-c2c035bb3cf4" containerName="probe" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.299006 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" containerName="neutron-httpd" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.299063 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="070d0729-602a-4401-ad53-e721f87a447c" containerName="proxy-httpd" Oct 01 
11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.299335 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" containerName="glance-log" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.300365 4669 scope.go:117] "RemoveContainer" containerID="b1cd9f0e242b16755a3f6a3ea4dc0fc22066c190f2bdac9d7dee651f0bafd1bb" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.300869 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.307784 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.308970 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" (UID: "a701bc8f-fd3f-43ed-9b7a-bbb3696dc598"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.321766 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.330609 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.337527 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.338486 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-httpd-run\") pod \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.338551 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-scripts\") pod \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.338605 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-config-data\") pod \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.338778 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-combined-ca-bundle\") pod \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.338821 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-logs\") pod \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.338849 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.338881 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-internal-tls-certs\") pod \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.338912 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j5rz\" (UniqueName: \"kubernetes.io/projected/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-kube-api-access-2j5rz\") pod \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\" (UID: \"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e\") " Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.339184 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.339254 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dc50b83-702d-4bf7-bee7-87ead33a1faa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.339310 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-scripts\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.339362 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njnkg\" (UniqueName: \"kubernetes.io/projected/7dc50b83-702d-4bf7-bee7-87ead33a1faa-kube-api-access-njnkg\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.339400 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-config-data\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.339429 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.339491 4669 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.340708 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.340948 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 11:47:10 crc kubenswrapper[4669]: 
I1001 11:47:10.342060 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-logs" (OuterVolumeSpecName: "logs") pod "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" (UID: "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.347646 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.348732 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-scripts" (OuterVolumeSpecName: "scripts") pod "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" (UID: "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.351982 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" (UID: "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.352473 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" (UID: "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.353664 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-kube-api-access-2j5rz" (OuterVolumeSpecName: "kube-api-access-2j5rz") pod "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" (UID: "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e"). InnerVolumeSpecName "kube-api-access-2j5rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.403496 4669 scope.go:117] "RemoveContainer" containerID="95b8385e616503734b5a9cc1a953a6e77cf347a538e902767ea66fb96fc4b63d" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.418219 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" (UID: "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.424684 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" (UID: "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.438354 4669 scope.go:117] "RemoveContainer" containerID="f14f59a06358c14e46fee0297a491f0970c32a227d14e4c5b4d5e78d81d82f19" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.440942 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dc50b83-702d-4bf7-bee7-87ead33a1faa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441002 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-log-httpd\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441072 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441131 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-scripts\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441159 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-config-data\") pod \"ceilometer-0\" (UID: 
\"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441182 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-scripts\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441213 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njnkg\" (UniqueName: \"kubernetes.io/projected/7dc50b83-702d-4bf7-bee7-87ead33a1faa-kube-api-access-njnkg\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441239 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-config-data\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441262 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441286 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441306 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-run-httpd\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441331 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vcz8\" (UniqueName: \"kubernetes.io/projected/ea585ed6-b11c-485f-87d9-47145a877b8b-kube-api-access-7vcz8\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441356 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441470 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441473 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dc50b83-702d-4bf7-bee7-87ead33a1faa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441523 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 
11:47:10.441565 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441603 4669 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441618 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j5rz\" (UniqueName: \"kubernetes.io/projected/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-kube-api-access-2j5rz\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441632 4669 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.441642 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.449947 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-scripts\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.453408 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc 
kubenswrapper[4669]: I1001 11:47:10.459920 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.468378 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc50b83-702d-4bf7-bee7-87ead33a1faa-config-data\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.472838 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njnkg\" (UniqueName: \"kubernetes.io/projected/7dc50b83-702d-4bf7-bee7-87ead33a1faa-kube-api-access-njnkg\") pod \"cinder-scheduler-0\" (UID: \"7dc50b83-702d-4bf7-bee7-87ead33a1faa\") " pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.511416 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-config-data" (OuterVolumeSpecName: "config-data") pod "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" (UID: "69cf7ef4-5d7b-487b-b6d7-f3b29335c19e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.517343 4669 scope.go:117] "RemoveContainer" containerID="5131c7251e0298e73104d363ae17a23fe62dbfe24b5c66aa6f9dc78a313b01bb" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.517460 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.543174 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-scripts\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.543357 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.543480 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-run-httpd\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.543570 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vcz8\" (UniqueName: \"kubernetes.io/projected/ea585ed6-b11c-485f-87d9-47145a877b8b-kube-api-access-7vcz8\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.543719 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-log-httpd\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.543830 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.543929 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-config-data\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.544057 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.544150 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.546693 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-run-httpd\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.549068 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-config-data\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.549726 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-log-httpd\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.551979 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.558848 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.559461 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-scripts\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.583591 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vcz8\" (UniqueName: \"kubernetes.io/projected/ea585ed6-b11c-485f-87d9-47145a877b8b-kube-api-access-7vcz8\") pod \"ceilometer-0\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " pod="openstack/ceilometer-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.584457 4669 scope.go:117] 
"RemoveContainer" containerID="3f8f1272aef600a2c67d9d6be60e69f39c589a20ced292505152864eb7e3450b" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.619119 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6fbb698fb8-vwrw5"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.628276 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6fbb698fb8-vwrw5"] Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.723783 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 11:47:10 crc kubenswrapper[4669]: I1001 11:47:10.751517 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.033690 4669 generic.go:334] "Generic (PLEG): container finished" podID="7dbbc207-4a1d-40c3-8392-ebfc5def670a" containerID="7cb5359763691dde1e21a6058bd3daddb4c1ba78f9bc344afb0bbddd8356ed6d" exitCode=0 Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.033813 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x5w8f" event={"ID":"7dbbc207-4a1d-40c3-8392-ebfc5def670a","Type":"ContainerDied","Data":"7cb5359763691dde1e21a6058bd3daddb4c1ba78f9bc344afb0bbddd8356ed6d"} Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.034274 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x5w8f" event={"ID":"7dbbc207-4a1d-40c3-8392-ebfc5def670a","Type":"ContainerStarted","Data":"ebfeef454dfd1ee7c2987e1e07befee6efd6769c54ef511af7f856143c5a2b68"} Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.037913 4669 generic.go:334] "Generic (PLEG): container finished" podID="f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0" containerID="6c758ca5869845c8abc0c014a267fcfc25f4243a9ec74a2860849eb6ccea90e0" exitCode=0 Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.037994 4669 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-db-create-8tkmw" event={"ID":"f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0","Type":"ContainerDied","Data":"6c758ca5869845c8abc0c014a267fcfc25f4243a9ec74a2860849eb6ccea90e0"} Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.038028 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8tkmw" event={"ID":"f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0","Type":"ContainerStarted","Data":"59dc8d6c53085e05251eeece1d54a6ed1cab987be5e9b7f5a1301a3edb1b97cd"} Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.045131 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.045159 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69cf7ef4-5d7b-487b-b6d7-f3b29335c19e","Type":"ContainerDied","Data":"8d0acaf38053c80e1c53c4f04e51b371236b4aa9b2de2e8e32685bf69e3d19d3"} Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.045263 4669 scope.go:117] "RemoveContainer" containerID="4a46420043020db08e7e7eda689a74f780f828af0ee6c273152e621e55ad694e" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.081386 4669 generic.go:334] "Generic (PLEG): container finished" podID="ebda0307-643b-4933-aea3-a3ea9b534f50" containerID="e7a3efef00b92e67553b4b954621149b11ce00a8f4ed592ee39909410c19d004" exitCode=0 Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.081729 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-b72rx" event={"ID":"ebda0307-643b-4933-aea3-a3ea9b534f50","Type":"ContainerDied","Data":"e7a3efef00b92e67553b4b954621149b11ce00a8f4ed592ee39909410c19d004"} Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.081787 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-b72rx" 
event={"ID":"ebda0307-643b-4933-aea3-a3ea9b534f50","Type":"ContainerStarted","Data":"ca03e25ff30959340ee038a58773e55cbcc952ecddd62d0020e03ee235e0bc70"} Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.117453 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.125449 4669 scope.go:117] "RemoveContainer" containerID="8b2221caba3f5623b222afb7931d94552eafe6fabfc0de0b300e59c9971a63d8" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.130814 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.158813 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.161617 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.169147 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.169277 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.169574 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.265823 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.265914 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.265965 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.266055 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.266183 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvwll\" (UniqueName: \"kubernetes.io/projected/0712d8cd-5673-4792-bafd-463179234f1d-kube-api-access-wvwll\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.266244 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc 
kubenswrapper[4669]: I1001 11:47:11.266309 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0712d8cd-5673-4792-bafd-463179234f1d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.266345 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0712d8cd-5673-4792-bafd-463179234f1d-logs\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.277328 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.368102 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.368165 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvwll\" (UniqueName: \"kubernetes.io/projected/0712d8cd-5673-4792-bafd-463179234f1d-kube-api-access-wvwll\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.368222 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.368274 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0712d8cd-5673-4792-bafd-463179234f1d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.368304 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0712d8cd-5673-4792-bafd-463179234f1d-logs\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.368354 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.368371 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.368402 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " 
pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.369290 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0712d8cd-5673-4792-bafd-463179234f1d-logs\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.369663 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0712d8cd-5673-4792-bafd-463179234f1d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.370060 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.376870 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.377571 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 
11:47:11.378130 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.392933 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvwll\" (UniqueName: \"kubernetes.io/projected/0712d8cd-5673-4792-bafd-463179234f1d-kube-api-access-wvwll\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.404242 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0712d8cd-5673-4792-bafd-463179234f1d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.404484 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 11:47:11 crc kubenswrapper[4669]: W1001 11:47:11.411682 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc50b83_702d_4bf7_bee7_87ead33a1faa.slice/crio-34a6a82804db1b0598d96434e67161e96d4aa6daa2beebbaa104140475fa25cf WatchSource:0}: Error finding container 34a6a82804db1b0598d96434e67161e96d4aa6daa2beebbaa104140475fa25cf: Status 404 returned error can't find the container with id 34a6a82804db1b0598d96434e67161e96d4aa6daa2beebbaa104140475fa25cf Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.440053 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0712d8cd-5673-4792-bafd-463179234f1d\") " pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.489481 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.492566 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.669213 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="070d0729-602a-4401-ad53-e721f87a447c" path="/var/lib/kubelet/pods/070d0729-602a-4401-ad53-e721f87a447c/volumes" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.671219 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59d42f69-5e5d-498b-a47b-c2c035bb3cf4" path="/var/lib/kubelet/pods/59d42f69-5e5d-498b-a47b-c2c035bb3cf4/volumes" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.672412 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69cf7ef4-5d7b-487b-b6d7-f3b29335c19e" path="/var/lib/kubelet/pods/69cf7ef4-5d7b-487b-b6d7-f3b29335c19e/volumes" Oct 01 11:47:11 crc kubenswrapper[4669]: I1001 11:47:11.673039 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a701bc8f-fd3f-43ed-9b7a-bbb3696dc598" path="/var/lib/kubelet/pods/a701bc8f-fd3f-43ed-9b7a-bbb3696dc598/volumes" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.113448 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea585ed6-b11c-485f-87d9-47145a877b8b","Type":"ContainerStarted","Data":"8520cccbf9920e80782d8d92420310c1d834d04576b00eeb5bae551895a2fe07"} Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.135393 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 11:47:12 
crc kubenswrapper[4669]: W1001 11:47:12.136480 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0712d8cd_5673_4792_bafd_463179234f1d.slice/crio-9a34167a3f67fc94de6f0331f95a3254b3d9e6e3f2142d849c48cb931f9fb884 WatchSource:0}: Error finding container 9a34167a3f67fc94de6f0331f95a3254b3d9e6e3f2142d849c48cb931f9fb884: Status 404 returned error can't find the container with id 9a34167a3f67fc94de6f0331f95a3254b3d9e6e3f2142d849c48cb931f9fb884 Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.137017 4669 generic.go:334] "Generic (PLEG): container finished" podID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerID="b5eacb4ec5ca11175d7373e4c1b4fe7356249a8933437d7c1394c5af780dabc0" exitCode=0 Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.137194 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3e7cb3e-aea7-4369-91a7-ccdaf2531415","Type":"ContainerDied","Data":"b5eacb4ec5ca11175d7373e4c1b4fe7356249a8933437d7c1394c5af780dabc0"} Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.143559 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dc50b83-702d-4bf7-bee7-87ead33a1faa","Type":"ContainerStarted","Data":"34a6a82804db1b0598d96434e67161e96d4aa6daa2beebbaa104140475fa25cf"} Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.299360 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.395327 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-config-data\") pod \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.396188 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-combined-ca-bundle\") pod \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.396229 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mhwg\" (UniqueName: \"kubernetes.io/projected/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-kube-api-access-2mhwg\") pod \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.396266 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-logs\") pod \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.396324 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-scripts\") pod \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.396562 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.396597 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-public-tls-certs\") pod \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.396654 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-httpd-run\") pod \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\" (UID: \"d3e7cb3e-aea7-4369-91a7-ccdaf2531415\") " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.397560 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d3e7cb3e-aea7-4369-91a7-ccdaf2531415" (UID: "d3e7cb3e-aea7-4369-91a7-ccdaf2531415"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.403179 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-kube-api-access-2mhwg" (OuterVolumeSpecName: "kube-api-access-2mhwg") pod "d3e7cb3e-aea7-4369-91a7-ccdaf2531415" (UID: "d3e7cb3e-aea7-4369-91a7-ccdaf2531415"). InnerVolumeSpecName "kube-api-access-2mhwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.403279 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-scripts" (OuterVolumeSpecName: "scripts") pod "d3e7cb3e-aea7-4369-91a7-ccdaf2531415" (UID: "d3e7cb3e-aea7-4369-91a7-ccdaf2531415"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.403710 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-logs" (OuterVolumeSpecName: "logs") pod "d3e7cb3e-aea7-4369-91a7-ccdaf2531415" (UID: "d3e7cb3e-aea7-4369-91a7-ccdaf2531415"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.416772 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "d3e7cb3e-aea7-4369-91a7-ccdaf2531415" (UID: "d3e7cb3e-aea7-4369-91a7-ccdaf2531415"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.451569 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3e7cb3e-aea7-4369-91a7-ccdaf2531415" (UID: "d3e7cb3e-aea7-4369-91a7-ccdaf2531415"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.481296 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-config-data" (OuterVolumeSpecName: "config-data") pod "d3e7cb3e-aea7-4369-91a7-ccdaf2531415" (UID: "d3e7cb3e-aea7-4369-91a7-ccdaf2531415"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.481404 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d3e7cb3e-aea7-4369-91a7-ccdaf2531415" (UID: "d3e7cb3e-aea7-4369-91a7-ccdaf2531415"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.500409 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.500443 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.500571 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mhwg\" (UniqueName: \"kubernetes.io/projected/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-kube-api-access-2mhwg\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.500584 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:12 
crc kubenswrapper[4669]: I1001 11:47:12.500593 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.500633 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.500646 4669 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.500657 4669 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3e7cb3e-aea7-4369-91a7-ccdaf2531415-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.535903 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.579685 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-x5w8f" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.602957 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.704168 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdngg\" (UniqueName: \"kubernetes.io/projected/7dbbc207-4a1d-40c3-8392-ebfc5def670a-kube-api-access-fdngg\") pod \"7dbbc207-4a1d-40c3-8392-ebfc5def670a\" (UID: \"7dbbc207-4a1d-40c3-8392-ebfc5def670a\") " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.717396 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dbbc207-4a1d-40c3-8392-ebfc5def670a-kube-api-access-fdngg" (OuterVolumeSpecName: "kube-api-access-fdngg") pod "7dbbc207-4a1d-40c3-8392-ebfc5def670a" (UID: "7dbbc207-4a1d-40c3-8392-ebfc5def670a"). InnerVolumeSpecName "kube-api-access-fdngg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.752644 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-b72rx" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.756006 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8tkmw" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.807023 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdngg\" (UniqueName: \"kubernetes.io/projected/7dbbc207-4a1d-40c3-8392-ebfc5def670a-kube-api-access-fdngg\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.911460 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmzgf\" (UniqueName: \"kubernetes.io/projected/f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0-kube-api-access-lmzgf\") pod \"f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0\" (UID: \"f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0\") " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.911751 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmbnn\" (UniqueName: \"kubernetes.io/projected/ebda0307-643b-4933-aea3-a3ea9b534f50-kube-api-access-cmbnn\") pod \"ebda0307-643b-4933-aea3-a3ea9b534f50\" (UID: \"ebda0307-643b-4933-aea3-a3ea9b534f50\") " Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.934181 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebda0307-643b-4933-aea3-a3ea9b534f50-kube-api-access-cmbnn" (OuterVolumeSpecName: "kube-api-access-cmbnn") pod "ebda0307-643b-4933-aea3-a3ea9b534f50" (UID: "ebda0307-643b-4933-aea3-a3ea9b534f50"). InnerVolumeSpecName "kube-api-access-cmbnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:12 crc kubenswrapper[4669]: I1001 11:47:12.951832 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0-kube-api-access-lmzgf" (OuterVolumeSpecName: "kube-api-access-lmzgf") pod "f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0" (UID: "f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0"). InnerVolumeSpecName "kube-api-access-lmzgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.015174 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmzgf\" (UniqueName: \"kubernetes.io/projected/f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0-kube-api-access-lmzgf\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.015208 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmbnn\" (UniqueName: \"kubernetes.io/projected/ebda0307-643b-4933-aea3-a3ea9b534f50-kube-api-access-cmbnn\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.190269 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.190759 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3e7cb3e-aea7-4369-91a7-ccdaf2531415","Type":"ContainerDied","Data":"09151fd19778a6fa820f6cce3c3ea829df696b93b20481ef96467db57af8311c"} Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.190833 4669 scope.go:117] "RemoveContainer" containerID="b5eacb4ec5ca11175d7373e4c1b4fe7356249a8933437d7c1394c5af780dabc0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.211599 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dc50b83-702d-4bf7-bee7-87ead33a1faa","Type":"ContainerStarted","Data":"3387070c2d582d7edf70f986f74e904ce462e7f59d9a20a1c8e99e0b9e781d73"} Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.217471 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-b72rx" event={"ID":"ebda0307-643b-4933-aea3-a3ea9b534f50","Type":"ContainerDied","Data":"ca03e25ff30959340ee038a58773e55cbcc952ecddd62d0020e03ee235e0bc70"} Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.217511 4669 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca03e25ff30959340ee038a58773e55cbcc952ecddd62d0020e03ee235e0bc70" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.217632 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-b72rx" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.234629 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0712d8cd-5673-4792-bafd-463179234f1d","Type":"ContainerStarted","Data":"9a34167a3f67fc94de6f0331f95a3254b3d9e6e3f2142d849c48cb931f9fb884"} Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.240190 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x5w8f" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.241008 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x5w8f" event={"ID":"7dbbc207-4a1d-40c3-8392-ebfc5def670a","Type":"ContainerDied","Data":"ebfeef454dfd1ee7c2987e1e07befee6efd6769c54ef511af7f856143c5a2b68"} Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.241061 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebfeef454dfd1ee7c2987e1e07befee6efd6769c54ef511af7f856143c5a2b68" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.247608 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8tkmw" event={"ID":"f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0","Type":"ContainerDied","Data":"59dc8d6c53085e05251eeece1d54a6ed1cab987be5e9b7f5a1301a3edb1b97cd"} Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.247651 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59dc8d6c53085e05251eeece1d54a6ed1cab987be5e9b7f5a1301a3edb1b97cd" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.247731 4669 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8tkmw" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.256992 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea585ed6-b11c-485f-87d9-47145a877b8b","Type":"ContainerStarted","Data":"52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710"} Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.355731 4669 scope.go:117] "RemoveContainer" containerID="3ff0027a5d4bf893f4821b994e546cad17f5cfdaaf0fa568764fcf16ddf16d3f" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.411099 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.440964 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.464995 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:47:13 crc kubenswrapper[4669]: E1001 11:47:13.466243 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-log" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.466267 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-log" Oct 01 11:47:13 crc kubenswrapper[4669]: E1001 11:47:13.466309 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dbbc207-4a1d-40c3-8392-ebfc5def670a" containerName="mariadb-database-create" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.466318 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dbbc207-4a1d-40c3-8392-ebfc5def670a" containerName="mariadb-database-create" Oct 01 11:47:13 crc kubenswrapper[4669]: E1001 11:47:13.466347 4669 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ebda0307-643b-4933-aea3-a3ea9b534f50" containerName="mariadb-database-create" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.466355 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebda0307-643b-4933-aea3-a3ea9b534f50" containerName="mariadb-database-create" Oct 01 11:47:13 crc kubenswrapper[4669]: E1001 11:47:13.466375 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0" containerName="mariadb-database-create" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.466382 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0" containerName="mariadb-database-create" Oct 01 11:47:13 crc kubenswrapper[4669]: E1001 11:47:13.466404 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-httpd" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.466412 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-httpd" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.466641 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dbbc207-4a1d-40c3-8392-ebfc5def670a" containerName="mariadb-database-create" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.466679 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0" containerName="mariadb-database-create" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.466687 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-httpd" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.466700 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" containerName="glance-log" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.466711 4669 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ebda0307-643b-4933-aea3-a3ea9b534f50" containerName="mariadb-database-create" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.468389 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.474148 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.475333 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.500286 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.659765 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-logs\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.659853 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.659901 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-scripts\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " 
pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.659941 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.659974 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvb94\" (UniqueName: \"kubernetes.io/projected/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-kube-api-access-rvb94\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.660041 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.660097 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-config-data\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.660156 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: 
\"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.717703 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e7cb3e-aea7-4369-91a7-ccdaf2531415" path="/var/lib/kubelet/pods/d3e7cb3e-aea7-4369-91a7-ccdaf2531415/volumes" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.761638 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-config-data\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.762034 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.762156 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-logs\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.762188 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.762214 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-scripts\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.762257 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.762281 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvb94\" (UniqueName: \"kubernetes.io/projected/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-kube-api-access-rvb94\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.762335 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.766332 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.766383 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-logs\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.769003 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.769703 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.775369 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.775406 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-config-data\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.778572 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.791693 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvb94\" (UniqueName: \"kubernetes.io/projected/0d4ea2b9-c6e4-4d27-866a-420be44d88f8-kube-api-access-rvb94\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:13 crc kubenswrapper[4669]: I1001 11:47:13.850238 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0d4ea2b9-c6e4-4d27-866a-420be44d88f8\") " pod="openstack/glance-default-external-api-0" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.112483 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.188399 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a85c-account-create-qmwxm"] Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.189819 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a85c-account-create-qmwxm" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.195674 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.202787 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a85c-account-create-qmwxm"] Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.285573 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zxmx\" (UniqueName: \"kubernetes.io/projected/35e0ee30-1e97-4d8d-8f57-f94949a53291-kube-api-access-8zxmx\") pod \"nova-cell0-a85c-account-create-qmwxm\" (UID: \"35e0ee30-1e97-4d8d-8f57-f94949a53291\") " pod="openstack/nova-cell0-a85c-account-create-qmwxm" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.294330 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0712d8cd-5673-4792-bafd-463179234f1d","Type":"ContainerStarted","Data":"dc0d0c6daeb70eb75aaf80686d219d0b5de1598c6a891c4def6d122c59bf1fb8"} Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.294393 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0712d8cd-5673-4792-bafd-463179234f1d","Type":"ContainerStarted","Data":"bb2494c1955a365b81d10962b744f7e3352ba1f52880313d9109e3cb678679f5"} Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.320281 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea585ed6-b11c-485f-87d9-47145a877b8b","Type":"ContainerStarted","Data":"35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5"} Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.320334 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ea585ed6-b11c-485f-87d9-47145a877b8b","Type":"ContainerStarted","Data":"27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610"} Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.349779 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dc50b83-702d-4bf7-bee7-87ead33a1faa","Type":"ContainerStarted","Data":"f4aa182dd32da42bf435b3fa3e9351523209e697789d86024754d9cf1ce3a2d8"} Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.353367 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.353340774 podStartE2EDuration="3.353340774s" podCreationTimestamp="2025-10-01 11:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:47:14.341661349 +0000 UTC m=+1125.441226326" watchObservedRunningTime="2025-10-01 11:47:14.353340774 +0000 UTC m=+1125.452905751" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.389017 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zxmx\" (UniqueName: \"kubernetes.io/projected/35e0ee30-1e97-4d8d-8f57-f94949a53291-kube-api-access-8zxmx\") pod \"nova-cell0-a85c-account-create-qmwxm\" (UID: \"35e0ee30-1e97-4d8d-8f57-f94949a53291\") " pod="openstack/nova-cell0-a85c-account-create-qmwxm" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.397850 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-5926-account-create-lhhnq"] Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.399333 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5926-account-create-lhhnq" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.408802 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.416375 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5926-account-create-lhhnq"] Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.443970 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zxmx\" (UniqueName: \"kubernetes.io/projected/35e0ee30-1e97-4d8d-8f57-f94949a53291-kube-api-access-8zxmx\") pod \"nova-cell0-a85c-account-create-qmwxm\" (UID: \"35e0ee30-1e97-4d8d-8f57-f94949a53291\") " pod="openstack/nova-cell0-a85c-account-create-qmwxm" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.461688 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.461652295 podStartE2EDuration="4.461652295s" podCreationTimestamp="2025-10-01 11:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:47:14.419375794 +0000 UTC m=+1125.518940771" watchObservedRunningTime="2025-10-01 11:47:14.461652295 +0000 UTC m=+1125.561217282" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.491034 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vdwh\" (UniqueName: \"kubernetes.io/projected/d88c22d8-fe18-470e-87c4-9ef21beeccce-kube-api-access-5vdwh\") pod \"nova-cell1-5926-account-create-lhhnq\" (UID: \"d88c22d8-fe18-470e-87c4-9ef21beeccce\") " pod="openstack/nova-cell1-5926-account-create-lhhnq" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.565595 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a85c-account-create-qmwxm" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.593227 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vdwh\" (UniqueName: \"kubernetes.io/projected/d88c22d8-fe18-470e-87c4-9ef21beeccce-kube-api-access-5vdwh\") pod \"nova-cell1-5926-account-create-lhhnq\" (UID: \"d88c22d8-fe18-470e-87c4-9ef21beeccce\") " pod="openstack/nova-cell1-5926-account-create-lhhnq" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.641738 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vdwh\" (UniqueName: \"kubernetes.io/projected/d88c22d8-fe18-470e-87c4-9ef21beeccce-kube-api-access-5vdwh\") pod \"nova-cell1-5926-account-create-lhhnq\" (UID: \"d88c22d8-fe18-470e-87c4-9ef21beeccce\") " pod="openstack/nova-cell1-5926-account-create-lhhnq" Oct 01 11:47:14 crc kubenswrapper[4669]: I1001 11:47:14.816757 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5926-account-create-lhhnq" Oct 01 11:47:15 crc kubenswrapper[4669]: I1001 11:47:15.006228 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 11:47:15 crc kubenswrapper[4669]: I1001 11:47:15.235147 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a85c-account-create-qmwxm"] Oct 01 11:47:15 crc kubenswrapper[4669]: I1001 11:47:15.362511 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0d4ea2b9-c6e4-4d27-866a-420be44d88f8","Type":"ContainerStarted","Data":"82986e77e88c4eacc84adac852fbb5ec2d7afdcb82de0cf0490ca34938a9184f"} Oct 01 11:47:15 crc kubenswrapper[4669]: I1001 11:47:15.367822 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a85c-account-create-qmwxm" event={"ID":"35e0ee30-1e97-4d8d-8f57-f94949a53291","Type":"ContainerStarted","Data":"13e62f72b8c5a15ce856101bb6cd803e3669007edb524c0363905b301d4c6d45"} Oct 01 11:47:15 crc kubenswrapper[4669]: I1001 11:47:15.385258 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5926-account-create-lhhnq"] Oct 01 11:47:15 crc kubenswrapper[4669]: I1001 11:47:15.724154 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.383303 4669 generic.go:334] "Generic (PLEG): container finished" podID="35e0ee30-1e97-4d8d-8f57-f94949a53291" containerID="4505d76d3141066e11a4ca1c6d2d359b10840e42c9705e611f9ebb14844fa9c2" exitCode=0 Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.383370 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a85c-account-create-qmwxm" event={"ID":"35e0ee30-1e97-4d8d-8f57-f94949a53291","Type":"ContainerDied","Data":"4505d76d3141066e11a4ca1c6d2d359b10840e42c9705e611f9ebb14844fa9c2"} Oct 01 11:47:16 crc 
kubenswrapper[4669]: I1001 11:47:16.385928 4669 generic.go:334] "Generic (PLEG): container finished" podID="d88c22d8-fe18-470e-87c4-9ef21beeccce" containerID="baa275b2a84c20c97af4d6ca9eba17ead71a873fa28209b883370788d9055296" exitCode=0 Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.386033 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5926-account-create-lhhnq" event={"ID":"d88c22d8-fe18-470e-87c4-9ef21beeccce","Type":"ContainerDied","Data":"baa275b2a84c20c97af4d6ca9eba17ead71a873fa28209b883370788d9055296"} Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.386056 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5926-account-create-lhhnq" event={"ID":"d88c22d8-fe18-470e-87c4-9ef21beeccce","Type":"ContainerStarted","Data":"0f11c651cefedaba9cd34080fa663468c3c6b80d1bbd72c1948a7ca518409890"} Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.389499 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea585ed6-b11c-485f-87d9-47145a877b8b","Type":"ContainerStarted","Data":"d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53"} Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.389643 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="ceilometer-central-agent" containerID="cri-o://52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710" gracePeriod=30 Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.389895 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.389947 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="proxy-httpd" 
containerID="cri-o://d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53" gracePeriod=30 Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.390022 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="sg-core" containerID="cri-o://27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610" gracePeriod=30 Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.390063 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="ceilometer-notification-agent" containerID="cri-o://35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5" gracePeriod=30 Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.414185 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0d4ea2b9-c6e4-4d27-866a-420be44d88f8","Type":"ContainerStarted","Data":"4a51ab38a94057301b8ff36e430f0351bd18b3a2ec2e34e69de094a6ce2cc06a"} Oct 01 11:47:16 crc kubenswrapper[4669]: I1001 11:47:16.465508 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.265339042 podStartE2EDuration="6.465478525s" podCreationTimestamp="2025-10-01 11:47:10 +0000 UTC" firstStartedPulling="2025-10-01 11:47:11.279388841 +0000 UTC m=+1122.378953818" lastFinishedPulling="2025-10-01 11:47:15.479528324 +0000 UTC m=+1126.579093301" observedRunningTime="2025-10-01 11:47:16.432585293 +0000 UTC m=+1127.532150270" watchObservedRunningTime="2025-10-01 11:47:16.465478525 +0000 UTC m=+1127.565043492" Oct 01 11:47:17 crc kubenswrapper[4669]: I1001 11:47:17.430125 4669 generic.go:334] "Generic (PLEG): container finished" podID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerID="d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53" exitCode=0 Oct 01 11:47:17 
crc kubenswrapper[4669]: I1001 11:47:17.430540 4669 generic.go:334] "Generic (PLEG): container finished" podID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerID="27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610" exitCode=2 Oct 01 11:47:17 crc kubenswrapper[4669]: I1001 11:47:17.430551 4669 generic.go:334] "Generic (PLEG): container finished" podID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerID="35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5" exitCode=0 Oct 01 11:47:17 crc kubenswrapper[4669]: I1001 11:47:17.430604 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea585ed6-b11c-485f-87d9-47145a877b8b","Type":"ContainerDied","Data":"d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53"} Oct 01 11:47:17 crc kubenswrapper[4669]: I1001 11:47:17.430640 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea585ed6-b11c-485f-87d9-47145a877b8b","Type":"ContainerDied","Data":"27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610"} Oct 01 11:47:17 crc kubenswrapper[4669]: I1001 11:47:17.430649 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea585ed6-b11c-485f-87d9-47145a877b8b","Type":"ContainerDied","Data":"35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5"} Oct 01 11:47:17 crc kubenswrapper[4669]: I1001 11:47:17.434499 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0d4ea2b9-c6e4-4d27-866a-420be44d88f8","Type":"ContainerStarted","Data":"fbf67bc6ed3e76093341508cfcacac84dc1eea92b8bda0cd453e0d1c72c82b3f"} Oct 01 11:47:17 crc kubenswrapper[4669]: I1001 11:47:17.478392 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.478365752 podStartE2EDuration="4.478365752s" podCreationTimestamp="2025-10-01 11:47:13 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:47:17.471176397 +0000 UTC m=+1128.570741384" watchObservedRunningTime="2025-10-01 11:47:17.478365752 +0000 UTC m=+1128.577930729" Oct 01 11:47:17 crc kubenswrapper[4669]: I1001 11:47:17.953538 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5926-account-create-lhhnq" Oct 01 11:47:17 crc kubenswrapper[4669]: I1001 11:47:17.959896 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a85c-account-create-qmwxm" Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.091067 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vdwh\" (UniqueName: \"kubernetes.io/projected/d88c22d8-fe18-470e-87c4-9ef21beeccce-kube-api-access-5vdwh\") pod \"d88c22d8-fe18-470e-87c4-9ef21beeccce\" (UID: \"d88c22d8-fe18-470e-87c4-9ef21beeccce\") " Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.091468 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zxmx\" (UniqueName: \"kubernetes.io/projected/35e0ee30-1e97-4d8d-8f57-f94949a53291-kube-api-access-8zxmx\") pod \"35e0ee30-1e97-4d8d-8f57-f94949a53291\" (UID: \"35e0ee30-1e97-4d8d-8f57-f94949a53291\") " Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.099118 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e0ee30-1e97-4d8d-8f57-f94949a53291-kube-api-access-8zxmx" (OuterVolumeSpecName: "kube-api-access-8zxmx") pod "35e0ee30-1e97-4d8d-8f57-f94949a53291" (UID: "35e0ee30-1e97-4d8d-8f57-f94949a53291"). InnerVolumeSpecName "kube-api-access-8zxmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.100226 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d88c22d8-fe18-470e-87c4-9ef21beeccce-kube-api-access-5vdwh" (OuterVolumeSpecName: "kube-api-access-5vdwh") pod "d88c22d8-fe18-470e-87c4-9ef21beeccce" (UID: "d88c22d8-fe18-470e-87c4-9ef21beeccce"). InnerVolumeSpecName "kube-api-access-5vdwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.194406 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vdwh\" (UniqueName: \"kubernetes.io/projected/d88c22d8-fe18-470e-87c4-9ef21beeccce-kube-api-access-5vdwh\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.194456 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zxmx\" (UniqueName: \"kubernetes.io/projected/35e0ee30-1e97-4d8d-8f57-f94949a53291-kube-api-access-8zxmx\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.446900 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a85c-account-create-qmwxm" Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.446942 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a85c-account-create-qmwxm" event={"ID":"35e0ee30-1e97-4d8d-8f57-f94949a53291","Type":"ContainerDied","Data":"13e62f72b8c5a15ce856101bb6cd803e3669007edb524c0363905b301d4c6d45"} Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.447004 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13e62f72b8c5a15ce856101bb6cd803e3669007edb524c0363905b301d4c6d45" Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.450663 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5926-account-create-lhhnq" event={"ID":"d88c22d8-fe18-470e-87c4-9ef21beeccce","Type":"ContainerDied","Data":"0f11c651cefedaba9cd34080fa663468c3c6b80d1bbd72c1948a7ca518409890"} Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.450795 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f11c651cefedaba9cd34080fa663468c3c6b80d1bbd72c1948a7ca518409890" Oct 01 11:47:18 crc kubenswrapper[4669]: I1001 11:47:18.450689 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5926-account-create-lhhnq" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.475955 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gcjc4"] Oct 01 11:47:19 crc kubenswrapper[4669]: E1001 11:47:19.476665 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d88c22d8-fe18-470e-87c4-9ef21beeccce" containerName="mariadb-account-create" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.476681 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88c22d8-fe18-470e-87c4-9ef21beeccce" containerName="mariadb-account-create" Oct 01 11:47:19 crc kubenswrapper[4669]: E1001 11:47:19.476706 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e0ee30-1e97-4d8d-8f57-f94949a53291" containerName="mariadb-account-create" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.476715 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e0ee30-1e97-4d8d-8f57-f94949a53291" containerName="mariadb-account-create" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.476909 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e0ee30-1e97-4d8d-8f57-f94949a53291" containerName="mariadb-account-create" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.476920 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d88c22d8-fe18-470e-87c4-9ef21beeccce" containerName="mariadb-account-create" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.477628 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.489729 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gcjc4"] Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.491380 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.491714 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8c5p5" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.491888 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.635155 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.635398 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-scripts\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.635469 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-config-data\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " 
pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.635569 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxmnc\" (UniqueName: \"kubernetes.io/projected/699259f2-9bb3-42f1-b04f-d95ab275e1aa-kube-api-access-jxmnc\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.737489 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.737556 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-scripts\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.737581 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-config-data\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.737621 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxmnc\" (UniqueName: \"kubernetes.io/projected/699259f2-9bb3-42f1-b04f-d95ab275e1aa-kube-api-access-jxmnc\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: 
\"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.742792 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-scripts\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.750905 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.753542 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-config-data\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.761944 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxmnc\" (UniqueName: \"kubernetes.io/projected/699259f2-9bb3-42f1-b04f-d95ab275e1aa-kube-api-access-jxmnc\") pod \"nova-cell0-conductor-db-sync-gcjc4\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:19 crc kubenswrapper[4669]: I1001 11:47:19.799019 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:20 crc kubenswrapper[4669]: I1001 11:47:20.362010 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gcjc4"] Oct 01 11:47:20 crc kubenswrapper[4669]: I1001 11:47:20.479401 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gcjc4" event={"ID":"699259f2-9bb3-42f1-b04f-d95ab275e1aa","Type":"ContainerStarted","Data":"60f649dffc3a15f99dc7b070fbeef8bf19ca14df8dacd4eca36e7ec58ef96ca9"} Oct 01 11:47:21 crc kubenswrapper[4669]: I1001 11:47:21.014781 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 11:47:21 crc kubenswrapper[4669]: I1001 11:47:21.507772 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:21 crc kubenswrapper[4669]: I1001 11:47:21.508327 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:21 crc kubenswrapper[4669]: I1001 11:47:21.568688 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:21 crc kubenswrapper[4669]: I1001 11:47:21.569226 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:22 crc kubenswrapper[4669]: I1001 11:47:22.535457 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:22 crc kubenswrapper[4669]: I1001 11:47:22.535500 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.230853 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.324412 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-combined-ca-bundle\") pod \"ea585ed6-b11c-485f-87d9-47145a877b8b\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.324496 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-sg-core-conf-yaml\") pod \"ea585ed6-b11c-485f-87d9-47145a877b8b\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.324620 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-config-data\") pod \"ea585ed6-b11c-485f-87d9-47145a877b8b\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.324653 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vcz8\" (UniqueName: \"kubernetes.io/projected/ea585ed6-b11c-485f-87d9-47145a877b8b-kube-api-access-7vcz8\") pod \"ea585ed6-b11c-485f-87d9-47145a877b8b\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.324701 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-scripts\") pod \"ea585ed6-b11c-485f-87d9-47145a877b8b\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.324731 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-log-httpd\") pod \"ea585ed6-b11c-485f-87d9-47145a877b8b\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.324812 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-run-httpd\") pod \"ea585ed6-b11c-485f-87d9-47145a877b8b\" (UID: \"ea585ed6-b11c-485f-87d9-47145a877b8b\") " Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.325634 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ea585ed6-b11c-485f-87d9-47145a877b8b" (UID: "ea585ed6-b11c-485f-87d9-47145a877b8b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.325841 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ea585ed6-b11c-485f-87d9-47145a877b8b" (UID: "ea585ed6-b11c-485f-87d9-47145a877b8b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.344949 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-scripts" (OuterVolumeSpecName: "scripts") pod "ea585ed6-b11c-485f-87d9-47145a877b8b" (UID: "ea585ed6-b11c-485f-87d9-47145a877b8b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.347258 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea585ed6-b11c-485f-87d9-47145a877b8b-kube-api-access-7vcz8" (OuterVolumeSpecName: "kube-api-access-7vcz8") pod "ea585ed6-b11c-485f-87d9-47145a877b8b" (UID: "ea585ed6-b11c-485f-87d9-47145a877b8b"). InnerVolumeSpecName "kube-api-access-7vcz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.361699 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ea585ed6-b11c-485f-87d9-47145a877b8b" (UID: "ea585ed6-b11c-485f-87d9-47145a877b8b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.410162 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea585ed6-b11c-485f-87d9-47145a877b8b" (UID: "ea585ed6-b11c-485f-87d9-47145a877b8b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.427550 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.427585 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.427595 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vcz8\" (UniqueName: \"kubernetes.io/projected/ea585ed6-b11c-485f-87d9-47145a877b8b-kube-api-access-7vcz8\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.427610 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.427621 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.427631 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea585ed6-b11c-485f-87d9-47145a877b8b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.434664 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-config-data" (OuterVolumeSpecName: "config-data") pod "ea585ed6-b11c-485f-87d9-47145a877b8b" (UID: "ea585ed6-b11c-485f-87d9-47145a877b8b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.530351 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea585ed6-b11c-485f-87d9-47145a877b8b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.550295 4669 generic.go:334] "Generic (PLEG): container finished" podID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerID="52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710" exitCode=0 Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.550384 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.550464 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea585ed6-b11c-485f-87d9-47145a877b8b","Type":"ContainerDied","Data":"52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710"} Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.550498 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea585ed6-b11c-485f-87d9-47145a877b8b","Type":"ContainerDied","Data":"8520cccbf9920e80782d8d92420310c1d834d04576b00eeb5bae551895a2fe07"} Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.550519 4669 scope.go:117] "RemoveContainer" containerID="d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.593028 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.594035 4669 scope.go:117] "RemoveContainer" containerID="27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.603941 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.625982 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:23 crc kubenswrapper[4669]: E1001 11:47:23.626657 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="ceilometer-notification-agent" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.626678 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="ceilometer-notification-agent" Oct 01 11:47:23 crc kubenswrapper[4669]: E1001 11:47:23.626717 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="proxy-httpd" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.626726 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="proxy-httpd" Oct 01 11:47:23 crc kubenswrapper[4669]: E1001 11:47:23.626746 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="ceilometer-central-agent" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.626755 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="ceilometer-central-agent" Oct 01 11:47:23 crc kubenswrapper[4669]: E1001 11:47:23.626801 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="sg-core" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.626810 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="sg-core" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.627059 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" 
containerName="ceilometer-notification-agent" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.627103 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="sg-core" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.627118 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="ceilometer-central-agent" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.627131 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" containerName="proxy-httpd" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.629697 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.633275 4669 scope.go:117] "RemoveContainer" containerID="35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.634359 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.638459 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.640294 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.669509 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea585ed6-b11c-485f-87d9-47145a877b8b" path="/var/lib/kubelet/pods/ea585ed6-b11c-485f-87d9-47145a877b8b/volumes" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.674121 4669 scope.go:117] "RemoveContainer" containerID="52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.704095 4669 scope.go:117] 
"RemoveContainer" containerID="d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53" Oct 01 11:47:23 crc kubenswrapper[4669]: E1001 11:47:23.705294 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53\": container with ID starting with d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53 not found: ID does not exist" containerID="d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.705333 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53"} err="failed to get container status \"d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53\": rpc error: code = NotFound desc = could not find container \"d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53\": container with ID starting with d0a1a72968d15774f4ad53a19c9f67dd44d07c4e017c61c692e2aef394dbaa53 not found: ID does not exist" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.705362 4669 scope.go:117] "RemoveContainer" containerID="27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610" Oct 01 11:47:23 crc kubenswrapper[4669]: E1001 11:47:23.705854 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610\": container with ID starting with 27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610 not found: ID does not exist" containerID="27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.705924 4669 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610"} err="failed to get container status \"27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610\": rpc error: code = NotFound desc = could not find container \"27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610\": container with ID starting with 27aedb5b1668e56154305238c24360a2993cd161dc42a06ccd448ee48d79d610 not found: ID does not exist" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.705941 4669 scope.go:117] "RemoveContainer" containerID="35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5" Oct 01 11:47:23 crc kubenswrapper[4669]: E1001 11:47:23.706291 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5\": container with ID starting with 35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5 not found: ID does not exist" containerID="35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.706376 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5"} err="failed to get container status \"35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5\": rpc error: code = NotFound desc = could not find container \"35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5\": container with ID starting with 35cf05d5ddce1c7d0f76daa417008612693b08d769e1144412cc822da261c6b5 not found: ID does not exist" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.706413 4669 scope.go:117] "RemoveContainer" containerID="52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710" Oct 01 11:47:23 crc kubenswrapper[4669]: E1001 11:47:23.706751 4669 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710\": container with ID starting with 52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710 not found: ID does not exist" containerID="52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.706783 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710"} err="failed to get container status \"52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710\": rpc error: code = NotFound desc = could not find container \"52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710\": container with ID starting with 52ea3a3eb0e96cc4b4c1461d407a5027adec0d3a5e69fd9023c5176c3d659710 not found: ID does not exist" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.745526 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.745588 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-config-data\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.745701 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-run-httpd\") pod \"ceilometer-0\" (UID: 
\"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.745768 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.745799 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnmfw\" (UniqueName: \"kubernetes.io/projected/407bdb42-6af5-4792-835e-6c5f46b7df50-kube-api-access-cnmfw\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.745835 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-log-httpd\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.745911 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-scripts\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.848960 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-scripts\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.849111 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.849132 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-config-data\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.849217 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-run-httpd\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.849273 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.849300 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnmfw\" (UniqueName: \"kubernetes.io/projected/407bdb42-6af5-4792-835e-6c5f46b7df50-kube-api-access-cnmfw\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.849331 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-log-httpd\") pod \"ceilometer-0\" (UID: 
\"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.849860 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-log-httpd\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.850951 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-run-httpd\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.855283 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-scripts\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.861028 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.865966 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.868865 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-config-data\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.873062 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnmfw\" (UniqueName: \"kubernetes.io/projected/407bdb42-6af5-4792-835e-6c5f46b7df50-kube-api-access-cnmfw\") pod \"ceilometer-0\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " pod="openstack/ceilometer-0" Oct 01 11:47:23 crc kubenswrapper[4669]: I1001 11:47:23.967114 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.047552 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c7ad-account-create-jf4nw"] Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.048963 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c7ad-account-create-jf4nw" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.052993 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.062943 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c7ad-account-create-jf4nw"] Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.114284 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.114346 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.157262 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7lv\" (UniqueName: 
\"kubernetes.io/projected/3b9cae74-a51f-4d18-949d-ca999e48f5e3-kube-api-access-cb7lv\") pod \"nova-api-c7ad-account-create-jf4nw\" (UID: \"3b9cae74-a51f-4d18-949d-ca999e48f5e3\") " pod="openstack/nova-api-c7ad-account-create-jf4nw" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.157868 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.172615 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.259182 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7lv\" (UniqueName: \"kubernetes.io/projected/3b9cae74-a51f-4d18-949d-ca999e48f5e3-kube-api-access-cb7lv\") pod \"nova-api-c7ad-account-create-jf4nw\" (UID: \"3b9cae74-a51f-4d18-949d-ca999e48f5e3\") " pod="openstack/nova-api-c7ad-account-create-jf4nw" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.290922 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7lv\" (UniqueName: \"kubernetes.io/projected/3b9cae74-a51f-4d18-949d-ca999e48f5e3-kube-api-access-cb7lv\") pod \"nova-api-c7ad-account-create-jf4nw\" (UID: \"3b9cae74-a51f-4d18-949d-ca999e48f5e3\") " pod="openstack/nova-api-c7ad-account-create-jf4nw" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.385612 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c7ad-account-create-jf4nw" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.581057 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.581814 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.695422 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.695533 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 11:47:24 crc kubenswrapper[4669]: I1001 11:47:24.732052 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 11:47:26 crc kubenswrapper[4669]: I1001 11:47:26.603835 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 11:47:26 crc kubenswrapper[4669]: I1001 11:47:26.603874 4669 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 11:47:27 crc kubenswrapper[4669]: I1001 11:47:27.092737 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 11:47:27 crc kubenswrapper[4669]: I1001 11:47:27.098192 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 11:47:30 crc kubenswrapper[4669]: I1001 11:47:30.439372 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c7ad-account-create-jf4nw"] Oct 01 11:47:30 crc kubenswrapper[4669]: I1001 11:47:30.448104 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 01 11:47:30 crc kubenswrapper[4669]: I1001 11:47:30.578778 4669 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:30 crc kubenswrapper[4669]: I1001 11:47:30.652684 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gcjc4" event={"ID":"699259f2-9bb3-42f1-b04f-d95ab275e1aa","Type":"ContainerStarted","Data":"1824df4216d73949ef552fe39392a129a7190f6916b89dfbc645ae424399fdb1"} Oct 01 11:47:30 crc kubenswrapper[4669]: I1001 11:47:30.656095 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7ad-account-create-jf4nw" event={"ID":"3b9cae74-a51f-4d18-949d-ca999e48f5e3","Type":"ContainerStarted","Data":"5ab148946e5438398447ef1a5981acc07e37f691e0d3f16990af903bb543284b"} Oct 01 11:47:30 crc kubenswrapper[4669]: I1001 11:47:30.656193 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7ad-account-create-jf4nw" event={"ID":"3b9cae74-a51f-4d18-949d-ca999e48f5e3","Type":"ContainerStarted","Data":"0c4eb6cda640c1529609bbbff7a080c9f4dfdc7feb1de7fb8c62aa2f4533626f"} Oct 01 11:47:30 crc kubenswrapper[4669]: I1001 11:47:30.667125 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"407bdb42-6af5-4792-835e-6c5f46b7df50","Type":"ContainerStarted","Data":"a95aa99fd97e37448fcf6cf5c76e4a5c8f3037347bfd3bf9299e591e5ec14973"} Oct 01 11:47:30 crc kubenswrapper[4669]: I1001 11:47:30.695068 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gcjc4" podStartSLOduration=2.10083015 podStartE2EDuration="11.695039817s" podCreationTimestamp="2025-10-01 11:47:19 +0000 UTC" firstStartedPulling="2025-10-01 11:47:20.372007669 +0000 UTC m=+1131.471572646" lastFinishedPulling="2025-10-01 11:47:29.966217336 +0000 UTC m=+1141.065782313" observedRunningTime="2025-10-01 11:47:30.680188075 +0000 UTC m=+1141.779753062" watchObservedRunningTime="2025-10-01 11:47:30.695039817 +0000 UTC m=+1141.794604794" Oct 01 11:47:30 crc 
kubenswrapper[4669]: I1001 11:47:30.709144 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-c7ad-account-create-jf4nw" podStartSLOduration=6.709124911 podStartE2EDuration="6.709124911s" podCreationTimestamp="2025-10-01 11:47:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:47:30.707722437 +0000 UTC m=+1141.807287414" watchObservedRunningTime="2025-10-01 11:47:30.709124911 +0000 UTC m=+1141.808689878" Oct 01 11:47:31 crc kubenswrapper[4669]: I1001 11:47:31.679788 4669 generic.go:334] "Generic (PLEG): container finished" podID="3b9cae74-a51f-4d18-949d-ca999e48f5e3" containerID="5ab148946e5438398447ef1a5981acc07e37f691e0d3f16990af903bb543284b" exitCode=0 Oct 01 11:47:31 crc kubenswrapper[4669]: I1001 11:47:31.679836 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7ad-account-create-jf4nw" event={"ID":"3b9cae74-a51f-4d18-949d-ca999e48f5e3","Type":"ContainerDied","Data":"5ab148946e5438398447ef1a5981acc07e37f691e0d3f16990af903bb543284b"} Oct 01 11:47:31 crc kubenswrapper[4669]: I1001 11:47:31.683454 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"407bdb42-6af5-4792-835e-6c5f46b7df50","Type":"ContainerStarted","Data":"738d4311e1ba2ea07bf6f6fc408fbbaaef57f327bac48ba9aa7a01f478367945"} Oct 01 11:47:32 crc kubenswrapper[4669]: I1001 11:47:32.695679 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"407bdb42-6af5-4792-835e-6c5f46b7df50","Type":"ContainerStarted","Data":"c6744064249f4ad829a91e0e9d117bca8eead715f1762585c813d87f8e9d2977"} Oct 01 11:47:33 crc kubenswrapper[4669]: I1001 11:47:33.166178 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c7ad-account-create-jf4nw" Oct 01 11:47:33 crc kubenswrapper[4669]: I1001 11:47:33.195050 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb7lv\" (UniqueName: \"kubernetes.io/projected/3b9cae74-a51f-4d18-949d-ca999e48f5e3-kube-api-access-cb7lv\") pod \"3b9cae74-a51f-4d18-949d-ca999e48f5e3\" (UID: \"3b9cae74-a51f-4d18-949d-ca999e48f5e3\") " Oct 01 11:47:33 crc kubenswrapper[4669]: I1001 11:47:33.203105 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b9cae74-a51f-4d18-949d-ca999e48f5e3-kube-api-access-cb7lv" (OuterVolumeSpecName: "kube-api-access-cb7lv") pod "3b9cae74-a51f-4d18-949d-ca999e48f5e3" (UID: "3b9cae74-a51f-4d18-949d-ca999e48f5e3"). InnerVolumeSpecName "kube-api-access-cb7lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:33 crc kubenswrapper[4669]: I1001 11:47:33.299988 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb7lv\" (UniqueName: \"kubernetes.io/projected/3b9cae74-a51f-4d18-949d-ca999e48f5e3-kube-api-access-cb7lv\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:33 crc kubenswrapper[4669]: I1001 11:47:33.710341 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7ad-account-create-jf4nw" event={"ID":"3b9cae74-a51f-4d18-949d-ca999e48f5e3","Type":"ContainerDied","Data":"0c4eb6cda640c1529609bbbff7a080c9f4dfdc7feb1de7fb8c62aa2f4533626f"} Oct 01 11:47:33 crc kubenswrapper[4669]: I1001 11:47:33.710418 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c4eb6cda640c1529609bbbff7a080c9f4dfdc7feb1de7fb8c62aa2f4533626f" Oct 01 11:47:33 crc kubenswrapper[4669]: I1001 11:47:33.710509 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c7ad-account-create-jf4nw" Oct 01 11:47:33 crc kubenswrapper[4669]: I1001 11:47:33.714090 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"407bdb42-6af5-4792-835e-6c5f46b7df50","Type":"ContainerStarted","Data":"87768896b9a8142dbc88c3f0346fa136f0c2fd00285f643aa8788b7a0e43e2d1"} Oct 01 11:47:34 crc kubenswrapper[4669]: I1001 11:47:34.729979 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"407bdb42-6af5-4792-835e-6c5f46b7df50","Type":"ContainerStarted","Data":"e011c9031e816a630b0d59818446f8fdc972bd75bfe9aa49abc9b31d811c6233"} Oct 01 11:47:34 crc kubenswrapper[4669]: I1001 11:47:34.730580 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 11:47:34 crc kubenswrapper[4669]: I1001 11:47:34.758739 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.943857995 podStartE2EDuration="11.758711593s" podCreationTimestamp="2025-10-01 11:47:23 +0000 UTC" firstStartedPulling="2025-10-01 11:47:30.59093685 +0000 UTC m=+1141.690501827" lastFinishedPulling="2025-10-01 11:47:34.405790418 +0000 UTC m=+1145.505355425" observedRunningTime="2025-10-01 11:47:34.757176715 +0000 UTC m=+1145.856741702" watchObservedRunningTime="2025-10-01 11:47:34.758711593 +0000 UTC m=+1145.858276570" Oct 01 11:47:39 crc kubenswrapper[4669]: I1001 11:47:39.295524 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:39 crc kubenswrapper[4669]: I1001 11:47:39.296595 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="ceilometer-central-agent" containerID="cri-o://738d4311e1ba2ea07bf6f6fc408fbbaaef57f327bac48ba9aa7a01f478367945" gracePeriod=30 Oct 01 11:47:39 crc kubenswrapper[4669]: I1001 
11:47:39.297239 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="proxy-httpd" containerID="cri-o://e011c9031e816a630b0d59818446f8fdc972bd75bfe9aa49abc9b31d811c6233" gracePeriod=30 Oct 01 11:47:39 crc kubenswrapper[4669]: I1001 11:47:39.297295 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="sg-core" containerID="cri-o://87768896b9a8142dbc88c3f0346fa136f0c2fd00285f643aa8788b7a0e43e2d1" gracePeriod=30 Oct 01 11:47:39 crc kubenswrapper[4669]: I1001 11:47:39.297332 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="ceilometer-notification-agent" containerID="cri-o://c6744064249f4ad829a91e0e9d117bca8eead715f1762585c813d87f8e9d2977" gracePeriod=30 Oct 01 11:47:39 crc kubenswrapper[4669]: I1001 11:47:39.795546 4669 generic.go:334] "Generic (PLEG): container finished" podID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerID="e011c9031e816a630b0d59818446f8fdc972bd75bfe9aa49abc9b31d811c6233" exitCode=0 Oct 01 11:47:39 crc kubenswrapper[4669]: I1001 11:47:39.795593 4669 generic.go:334] "Generic (PLEG): container finished" podID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerID="87768896b9a8142dbc88c3f0346fa136f0c2fd00285f643aa8788b7a0e43e2d1" exitCode=2 Oct 01 11:47:39 crc kubenswrapper[4669]: I1001 11:47:39.795608 4669 generic.go:334] "Generic (PLEG): container finished" podID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerID="c6744064249f4ad829a91e0e9d117bca8eead715f1762585c813d87f8e9d2977" exitCode=0 Oct 01 11:47:39 crc kubenswrapper[4669]: I1001 11:47:39.795652 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"407bdb42-6af5-4792-835e-6c5f46b7df50","Type":"ContainerDied","Data":"e011c9031e816a630b0d59818446f8fdc972bd75bfe9aa49abc9b31d811c6233"} Oct 01 11:47:39 crc kubenswrapper[4669]: I1001 11:47:39.795730 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"407bdb42-6af5-4792-835e-6c5f46b7df50","Type":"ContainerDied","Data":"87768896b9a8142dbc88c3f0346fa136f0c2fd00285f643aa8788b7a0e43e2d1"} Oct 01 11:47:39 crc kubenswrapper[4669]: I1001 11:47:39.795758 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"407bdb42-6af5-4792-835e-6c5f46b7df50","Type":"ContainerDied","Data":"c6744064249f4ad829a91e0e9d117bca8eead715f1762585c813d87f8e9d2977"} Oct 01 11:47:41 crc kubenswrapper[4669]: I1001 11:47:41.822337 4669 generic.go:334] "Generic (PLEG): container finished" podID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerID="738d4311e1ba2ea07bf6f6fc408fbbaaef57f327bac48ba9aa7a01f478367945" exitCode=0 Oct 01 11:47:41 crc kubenswrapper[4669]: I1001 11:47:41.823002 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"407bdb42-6af5-4792-835e-6c5f46b7df50","Type":"ContainerDied","Data":"738d4311e1ba2ea07bf6f6fc408fbbaaef57f327bac48ba9aa7a01f478367945"} Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.056934 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.129399 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnmfw\" (UniqueName: \"kubernetes.io/projected/407bdb42-6af5-4792-835e-6c5f46b7df50-kube-api-access-cnmfw\") pod \"407bdb42-6af5-4792-835e-6c5f46b7df50\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.129610 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-run-httpd\") pod \"407bdb42-6af5-4792-835e-6c5f46b7df50\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.129837 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-sg-core-conf-yaml\") pod \"407bdb42-6af5-4792-835e-6c5f46b7df50\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.129899 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-combined-ca-bundle\") pod \"407bdb42-6af5-4792-835e-6c5f46b7df50\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.129957 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-scripts\") pod \"407bdb42-6af5-4792-835e-6c5f46b7df50\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.130043 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-log-httpd\") pod \"407bdb42-6af5-4792-835e-6c5f46b7df50\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.130180 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-config-data\") pod \"407bdb42-6af5-4792-835e-6c5f46b7df50\" (UID: \"407bdb42-6af5-4792-835e-6c5f46b7df50\") " Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.131363 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "407bdb42-6af5-4792-835e-6c5f46b7df50" (UID: "407bdb42-6af5-4792-835e-6c5f46b7df50"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.133326 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "407bdb42-6af5-4792-835e-6c5f46b7df50" (UID: "407bdb42-6af5-4792-835e-6c5f46b7df50"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.156327 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-scripts" (OuterVolumeSpecName: "scripts") pod "407bdb42-6af5-4792-835e-6c5f46b7df50" (UID: "407bdb42-6af5-4792-835e-6c5f46b7df50"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.156612 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/407bdb42-6af5-4792-835e-6c5f46b7df50-kube-api-access-cnmfw" (OuterVolumeSpecName: "kube-api-access-cnmfw") pod "407bdb42-6af5-4792-835e-6c5f46b7df50" (UID: "407bdb42-6af5-4792-835e-6c5f46b7df50"). InnerVolumeSpecName "kube-api-access-cnmfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.169877 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "407bdb42-6af5-4792-835e-6c5f46b7df50" (UID: "407bdb42-6af5-4792-835e-6c5f46b7df50"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.223118 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "407bdb42-6af5-4792-835e-6c5f46b7df50" (UID: "407bdb42-6af5-4792-835e-6c5f46b7df50"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.235791 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.235860 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.235880 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnmfw\" (UniqueName: \"kubernetes.io/projected/407bdb42-6af5-4792-835e-6c5f46b7df50-kube-api-access-cnmfw\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.235899 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/407bdb42-6af5-4792-835e-6c5f46b7df50-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.235917 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.235935 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.257269 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-config-data" (OuterVolumeSpecName: "config-data") pod "407bdb42-6af5-4792-835e-6c5f46b7df50" (UID: "407bdb42-6af5-4792-835e-6c5f46b7df50"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.338201 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407bdb42-6af5-4792-835e-6c5f46b7df50-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.844855 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"407bdb42-6af5-4792-835e-6c5f46b7df50","Type":"ContainerDied","Data":"a95aa99fd97e37448fcf6cf5c76e4a5c8f3037347bfd3bf9299e591e5ec14973"} Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.844943 4669 scope.go:117] "RemoveContainer" containerID="e011c9031e816a630b0d59818446f8fdc972bd75bfe9aa49abc9b31d811c6233" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.848522 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.875649 4669 scope.go:117] "RemoveContainer" containerID="87768896b9a8142dbc88c3f0346fa136f0c2fd00285f643aa8788b7a0e43e2d1" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.896376 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.905382 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.911006 4669 scope.go:117] "RemoveContainer" containerID="c6744064249f4ad829a91e0e9d117bca8eead715f1762585c813d87f8e9d2977" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.961822 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:42 crc kubenswrapper[4669]: E1001 11:47:42.963063 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" 
containerName="ceilometer-notification-agent" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.963103 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="ceilometer-notification-agent" Oct 01 11:47:42 crc kubenswrapper[4669]: E1001 11:47:42.963129 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="sg-core" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.963135 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="sg-core" Oct 01 11:47:42 crc kubenswrapper[4669]: E1001 11:47:42.963160 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="proxy-httpd" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.963167 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="proxy-httpd" Oct 01 11:47:42 crc kubenswrapper[4669]: E1001 11:47:42.963185 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9cae74-a51f-4d18-949d-ca999e48f5e3" containerName="mariadb-account-create" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.963194 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9cae74-a51f-4d18-949d-ca999e48f5e3" containerName="mariadb-account-create" Oct 01 11:47:42 crc kubenswrapper[4669]: E1001 11:47:42.963214 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="ceilometer-central-agent" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.963221 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="ceilometer-central-agent" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.979387 4669 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3b9cae74-a51f-4d18-949d-ca999e48f5e3" containerName="mariadb-account-create" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.979457 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="ceilometer-notification-agent" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.979482 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="ceilometer-central-agent" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.979512 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="sg-core" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.979541 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" containerName="proxy-httpd" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.987767 4669 scope.go:117] "RemoveContainer" containerID="738d4311e1ba2ea07bf6f6fc408fbbaaef57f327bac48ba9aa7a01f478367945" Oct 01 11:47:42 crc kubenswrapper[4669]: I1001 11:47:42.990265 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.007940 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.010973 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.012897 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.068489 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.068887 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-log-httpd\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.068983 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-scripts\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.069094 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-config-data\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " 
pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.069303 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qffmr\" (UniqueName: \"kubernetes.io/projected/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-kube-api-access-qffmr\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.069379 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.069633 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-run-httpd\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.172253 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-scripts\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.172325 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-config-data\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.172354 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qffmr\" (UniqueName: \"kubernetes.io/projected/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-kube-api-access-qffmr\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.172375 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.172415 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-run-httpd\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.172479 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.172531 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-log-httpd\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.173434 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-run-httpd\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 
crc kubenswrapper[4669]: I1001 11:47:43.173496 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-log-httpd\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.179363 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-config-data\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.179704 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-scripts\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.185733 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.190105 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.201377 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qffmr\" (UniqueName: \"kubernetes.io/projected/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-kube-api-access-qffmr\") pod \"ceilometer-0\" (UID: 
\"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.327071 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.657198 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="407bdb42-6af5-4792-835e-6c5f46b7df50" path="/var/lib/kubelet/pods/407bdb42-6af5-4792-835e-6c5f46b7df50/volumes" Oct 01 11:47:43 crc kubenswrapper[4669]: I1001 11:47:43.889232 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:47:44 crc kubenswrapper[4669]: I1001 11:47:44.878461 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d559a1-8d44-41cf-a42f-51ab2b87d60f","Type":"ContainerStarted","Data":"3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6"} Oct 01 11:47:44 crc kubenswrapper[4669]: I1001 11:47:44.878797 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d559a1-8d44-41cf-a42f-51ab2b87d60f","Type":"ContainerStarted","Data":"caa33e2b2841761364de7bdc39c0f693e2120fbbbb3f7e04fcc94836510737c6"} Oct 01 11:47:45 crc kubenswrapper[4669]: I1001 11:47:45.913331 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d559a1-8d44-41cf-a42f-51ab2b87d60f","Type":"ContainerStarted","Data":"f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7"} Oct 01 11:47:46 crc kubenswrapper[4669]: I1001 11:47:46.925700 4669 generic.go:334] "Generic (PLEG): container finished" podID="699259f2-9bb3-42f1-b04f-d95ab275e1aa" containerID="1824df4216d73949ef552fe39392a129a7190f6916b89dfbc645ae424399fdb1" exitCode=0 Oct 01 11:47:46 crc kubenswrapper[4669]: I1001 11:47:46.925818 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gcjc4" 
event={"ID":"699259f2-9bb3-42f1-b04f-d95ab275e1aa","Type":"ContainerDied","Data":"1824df4216d73949ef552fe39392a129a7190f6916b89dfbc645ae424399fdb1"} Oct 01 11:47:46 crc kubenswrapper[4669]: I1001 11:47:46.928899 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d559a1-8d44-41cf-a42f-51ab2b87d60f","Type":"ContainerStarted","Data":"ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30"} Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.346368 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.520701 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxmnc\" (UniqueName: \"kubernetes.io/projected/699259f2-9bb3-42f1-b04f-d95ab275e1aa-kube-api-access-jxmnc\") pod \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.520786 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-config-data\") pod \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.521032 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-scripts\") pod \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.521096 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-combined-ca-bundle\") pod 
\"699259f2-9bb3-42f1-b04f-d95ab275e1aa\" (UID: \"699259f2-9bb3-42f1-b04f-d95ab275e1aa\") " Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.528942 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-scripts" (OuterVolumeSpecName: "scripts") pod "699259f2-9bb3-42f1-b04f-d95ab275e1aa" (UID: "699259f2-9bb3-42f1-b04f-d95ab275e1aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.534520 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699259f2-9bb3-42f1-b04f-d95ab275e1aa-kube-api-access-jxmnc" (OuterVolumeSpecName: "kube-api-access-jxmnc") pod "699259f2-9bb3-42f1-b04f-d95ab275e1aa" (UID: "699259f2-9bb3-42f1-b04f-d95ab275e1aa"). InnerVolumeSpecName "kube-api-access-jxmnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.553662 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-config-data" (OuterVolumeSpecName: "config-data") pod "699259f2-9bb3-42f1-b04f-d95ab275e1aa" (UID: "699259f2-9bb3-42f1-b04f-d95ab275e1aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.555036 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "699259f2-9bb3-42f1-b04f-d95ab275e1aa" (UID: "699259f2-9bb3-42f1-b04f-d95ab275e1aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.624660 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.626277 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.639739 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxmnc\" (UniqueName: \"kubernetes.io/projected/699259f2-9bb3-42f1-b04f-d95ab275e1aa-kube-api-access-jxmnc\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.639769 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699259f2-9bb3-42f1-b04f-d95ab275e1aa-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.953463 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d559a1-8d44-41cf-a42f-51ab2b87d60f","Type":"ContainerStarted","Data":"c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4"} Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.953566 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.956554 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gcjc4" event={"ID":"699259f2-9bb3-42f1-b04f-d95ab275e1aa","Type":"ContainerDied","Data":"60f649dffc3a15f99dc7b070fbeef8bf19ca14df8dacd4eca36e7ec58ef96ca9"} Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.956617 4669 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gcjc4" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.956645 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60f649dffc3a15f99dc7b070fbeef8bf19ca14df8dacd4eca36e7ec58ef96ca9" Oct 01 11:47:48 crc kubenswrapper[4669]: I1001 11:47:48.996315 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.086406416 podStartE2EDuration="6.996288382s" podCreationTimestamp="2025-10-01 11:47:42 +0000 UTC" firstStartedPulling="2025-10-01 11:47:43.897471236 +0000 UTC m=+1154.997036223" lastFinishedPulling="2025-10-01 11:47:47.807353212 +0000 UTC m=+1158.906918189" observedRunningTime="2025-10-01 11:47:48.985920479 +0000 UTC m=+1160.085485496" watchObservedRunningTime="2025-10-01 11:47:48.996288382 +0000 UTC m=+1160.095853359" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.083732 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 11:47:49 crc kubenswrapper[4669]: E1001 11:47:49.084365 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699259f2-9bb3-42f1-b04f-d95ab275e1aa" containerName="nova-cell0-conductor-db-sync" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.084424 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="699259f2-9bb3-42f1-b04f-d95ab275e1aa" containerName="nova-cell0-conductor-db-sync" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.084713 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="699259f2-9bb3-42f1-b04f-d95ab275e1aa" containerName="nova-cell0-conductor-db-sync" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.085595 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.089294 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8c5p5" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.089578 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.106733 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.151022 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e646b8-72fe-4762-a24b-a74ddfb6be97-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"35e646b8-72fe-4762-a24b-a74ddfb6be97\") " pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.151092 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e646b8-72fe-4762-a24b-a74ddfb6be97-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"35e646b8-72fe-4762-a24b-a74ddfb6be97\") " pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.151172 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w25vx\" (UniqueName: \"kubernetes.io/projected/35e646b8-72fe-4762-a24b-a74ddfb6be97-kube-api-access-w25vx\") pod \"nova-cell0-conductor-0\" (UID: \"35e646b8-72fe-4762-a24b-a74ddfb6be97\") " pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.253293 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/35e646b8-72fe-4762-a24b-a74ddfb6be97-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"35e646b8-72fe-4762-a24b-a74ddfb6be97\") " pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.253369 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e646b8-72fe-4762-a24b-a74ddfb6be97-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"35e646b8-72fe-4762-a24b-a74ddfb6be97\") " pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.253422 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w25vx\" (UniqueName: \"kubernetes.io/projected/35e646b8-72fe-4762-a24b-a74ddfb6be97-kube-api-access-w25vx\") pod \"nova-cell0-conductor-0\" (UID: \"35e646b8-72fe-4762-a24b-a74ddfb6be97\") " pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.267026 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e646b8-72fe-4762-a24b-a74ddfb6be97-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"35e646b8-72fe-4762-a24b-a74ddfb6be97\") " pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.268969 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e646b8-72fe-4762-a24b-a74ddfb6be97-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"35e646b8-72fe-4762-a24b-a74ddfb6be97\") " pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.273812 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w25vx\" (UniqueName: \"kubernetes.io/projected/35e646b8-72fe-4762-a24b-a74ddfb6be97-kube-api-access-w25vx\") pod \"nova-cell0-conductor-0\" 
(UID: \"35e646b8-72fe-4762-a24b-a74ddfb6be97\") " pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.404209 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.724547 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 11:47:49 crc kubenswrapper[4669]: I1001 11:47:49.968546 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"35e646b8-72fe-4762-a24b-a74ddfb6be97","Type":"ContainerStarted","Data":"d7ca7a1ff12bc2adde17a8ff65192d409394e25c7bac9eb74988547c5bf9dcae"} Oct 01 11:47:50 crc kubenswrapper[4669]: I1001 11:47:50.980430 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"35e646b8-72fe-4762-a24b-a74ddfb6be97","Type":"ContainerStarted","Data":"c4d480af4f8fa4a843223b43acd196f9caf7f0dc8ff37cc4c2a386b128346bd9"} Oct 01 11:47:50 crc kubenswrapper[4669]: I1001 11:47:50.981007 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:51 crc kubenswrapper[4669]: I1001 11:47:51.018992 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.018964942 podStartE2EDuration="2.018964942s" podCreationTimestamp="2025-10-01 11:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:47:51.013394776 +0000 UTC m=+1162.112959753" watchObservedRunningTime="2025-10-01 11:47:51.018964942 +0000 UTC m=+1162.118529909" Oct 01 11:47:59 crc kubenswrapper[4669]: I1001 11:47:59.439468 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 01 11:47:59 crc kubenswrapper[4669]: 
I1001 11:47:59.958463 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-csn4t"] Oct 01 11:47:59 crc kubenswrapper[4669]: I1001 11:47:59.959873 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:47:59 crc kubenswrapper[4669]: I1001 11:47:59.962093 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 01 11:47:59 crc kubenswrapper[4669]: I1001 11:47:59.975021 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 01 11:47:59 crc kubenswrapper[4669]: I1001 11:47:59.998291 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-csn4t"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.019647 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.019851 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8nm2\" (UniqueName: \"kubernetes.io/projected/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-kube-api-access-l8nm2\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.020146 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-scripts\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") 
" pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.020220 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-config-data\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.122824 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.123452 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8nm2\" (UniqueName: \"kubernetes.io/projected/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-kube-api-access-l8nm2\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.123609 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-scripts\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.123657 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-config-data\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " 
pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.123094 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.128450 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.141792 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.146176 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.148716 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.151722 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-scripts\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.153848 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-config-data\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.166828 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8nm2\" (UniqueName: 
\"kubernetes.io/projected/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-kube-api-access-l8nm2\") pod \"nova-cell0-cell-mapping-csn4t\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.228512 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.228564 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-config-data\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.228636 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27a3499b-1be2-457a-adc1-896f0297fc17-logs\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.228673 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shpfw\" (UniqueName: \"kubernetes.io/projected/27a3499b-1be2-457a-adc1-896f0297fc17-kube-api-access-shpfw\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.237867 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.239298 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.243543 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.280492 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.291447 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.332264 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4954\" (UniqueName: \"kubernetes.io/projected/86b8cae7-2784-4192-8b18-3cd38d6123f3-kube-api-access-c4954\") pod \"nova-cell1-novncproxy-0\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.332325 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.332352 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-config-data\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.332397 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.332456 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.332480 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27a3499b-1be2-457a-adc1-896f0297fc17-logs\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.332529 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shpfw\" (UniqueName: \"kubernetes.io/projected/27a3499b-1be2-457a-adc1-896f0297fc17-kube-api-access-shpfw\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.334601 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27a3499b-1be2-457a-adc1-896f0297fc17-logs\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.348413 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.350419 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.360468 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.360647 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-config-data\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.361177 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.371773 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.372245 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shpfw\" (UniqueName: \"kubernetes.io/projected/27a3499b-1be2-457a-adc1-896f0297fc17-kube-api-access-shpfw\") pod \"nova-api-0\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.437753 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.437827 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn4ml\" 
(UniqueName: \"kubernetes.io/projected/d5ae9886-4487-4150-83bc-cbb211c9dea0-kube-api-access-kn4ml\") pod \"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.437880 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.437924 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-config-data\") pod \"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.437988 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ae9886-4487-4150-83bc-cbb211c9dea0-logs\") pod \"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.438104 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4954\" (UniqueName: \"kubernetes.io/projected/86b8cae7-2784-4192-8b18-3cd38d6123f3-kube-api-access-c4954\") pod \"nova-cell1-novncproxy-0\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.438146 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.451226 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.454697 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.493376 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4954\" (UniqueName: \"kubernetes.io/projected/86b8cae7-2784-4192-8b18-3cd38d6123f3-kube-api-access-c4954\") pod \"nova-cell1-novncproxy-0\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.540067 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn4ml\" (UniqueName: \"kubernetes.io/projected/d5ae9886-4487-4150-83bc-cbb211c9dea0-kube-api-access-kn4ml\") pod \"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.540159 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-config-data\") pod \"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: 
I1001 11:48:00.540213 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ae9886-4487-4150-83bc-cbb211c9dea0-logs\") pod \"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.540288 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.541643 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ae9886-4487-4150-83bc-cbb211c9dea0-logs\") pod \"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.542334 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.543936 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.548624 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.550331 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-config-data\") pod \"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.552739 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.561162 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.571154 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.587111 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn4ml\" (UniqueName: \"kubernetes.io/projected/d5ae9886-4487-4150-83bc-cbb211c9dea0-kube-api-access-kn4ml\") pod \"nova-metadata-0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.587745 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.605270 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-wfg8j"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.615942 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.617995 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-wfg8j"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.751964 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-454lp\" (UniqueName: \"kubernetes.io/projected/2ba0e6c3-7c71-45df-acc3-127bec5afd42-kube-api-access-454lp\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.752543 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.752579 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-config-data\") pod \"nova-scheduler-0\" (UID: \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.752804 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.752976 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hkxf\" (UniqueName: \"kubernetes.io/projected/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-kube-api-access-6hkxf\") pod \"nova-scheduler-0\" (UID: \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.753016 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.753692 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.753790 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.753856 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-config\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.857866 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.858679 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-config\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.858774 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-454lp\" (UniqueName: \"kubernetes.io/projected/2ba0e6c3-7c71-45df-acc3-127bec5afd42-kube-api-access-454lp\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.858798 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-config-data\") pod \"nova-scheduler-0\" (UID: \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.858817 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.858860 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.858897 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.858916 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hkxf\" (UniqueName: \"kubernetes.io/projected/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-kube-api-access-6hkxf\") pod \"nova-scheduler-0\" (UID: \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.859006 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.859030 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.860129 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.860182 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.860386 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-config\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.861050 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.861595 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.866566 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-config-data\") pod \"nova-scheduler-0\" (UID: 
\"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.882825 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.890098 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-454lp\" (UniqueName: \"kubernetes.io/projected/2ba0e6c3-7c71-45df-acc3-127bec5afd42-kube-api-access-454lp\") pod \"dnsmasq-dns-845d6d6f59-wfg8j\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.912148 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hkxf\" (UniqueName: \"kubernetes.io/projected/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-kube-api-access-6hkxf\") pod \"nova-scheduler-0\" (UID: \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.951637 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-csn4t"] Oct 01 11:48:00 crc kubenswrapper[4669]: I1001 11:48:00.957369 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:00 crc kubenswrapper[4669]: W1001 11:48:00.961210 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f73cdc5_5e4b_4242_a588_2b8b4aa2f1b0.slice/crio-60d2b8d95fc24247f3f683240000bf50c5d07649995861bb1f36fc96cac80d10 WatchSource:0}: Error finding container 60d2b8d95fc24247f3f683240000bf50c5d07649995861bb1f36fc96cac80d10: Status 404 returned error can't find the container with id 60d2b8d95fc24247f3f683240000bf50c5d07649995861bb1f36fc96cac80d10 Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.137353 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csn4t" event={"ID":"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0","Type":"ContainerStarted","Data":"60d2b8d95fc24247f3f683240000bf50c5d07649995861bb1f36fc96cac80d10"} Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.192800 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.384001 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.527008 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c7d7q"] Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.529836 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.535664 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.535986 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.575487 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c7d7q"] Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.600107 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.685940 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flvbz\" (UniqueName: \"kubernetes.io/projected/688eb6a7-b463-4b36-9ef7-a365cbabac1f-kube-api-access-flvbz\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.686484 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-config-data\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.686557 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " 
pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.686581 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-scripts\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.748722 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.788856 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-config-data\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.789007 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.789060 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-scripts\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.789136 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flvbz\" (UniqueName: 
\"kubernetes.io/projected/688eb6a7-b463-4b36-9ef7-a365cbabac1f-kube-api-access-flvbz\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.796967 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-config-data\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.806408 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-scripts\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.806717 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.809594 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flvbz\" (UniqueName: \"kubernetes.io/projected/688eb6a7-b463-4b36-9ef7-a365cbabac1f-kube-api-access-flvbz\") pod \"nova-cell1-conductor-db-sync-c7d7q\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.871591 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:01 crc kubenswrapper[4669]: I1001 11:48:01.928679 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-wfg8j"] Oct 01 11:48:01 crc kubenswrapper[4669]: W1001 11:48:01.935609 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ba0e6c3_7c71_45df_acc3_127bec5afd42.slice/crio-2dede691fa28e0d1a1f22827d6d512207a66f743103c9f2b622e04d1267c4b0b WatchSource:0}: Error finding container 2dede691fa28e0d1a1f22827d6d512207a66f743103c9f2b622e04d1267c4b0b: Status 404 returned error can't find the container with id 2dede691fa28e0d1a1f22827d6d512207a66f743103c9f2b622e04d1267c4b0b Oct 01 11:48:02 crc kubenswrapper[4669]: I1001 11:48:02.049530 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:48:02 crc kubenswrapper[4669]: I1001 11:48:02.176019 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5ae9886-4487-4150-83bc-cbb211c9dea0","Type":"ContainerStarted","Data":"202498e1b27ea0427b42961da482596f65947fd4f8adeb9b0488f2b34a8140a2"} Oct 01 11:48:02 crc kubenswrapper[4669]: I1001 11:48:02.188182 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"86b8cae7-2784-4192-8b18-3cd38d6123f3","Type":"ContainerStarted","Data":"6099c7507ac33384031bdabff7c46594c39a6b6653abbf652108af4b70bee882"} Oct 01 11:48:02 crc kubenswrapper[4669]: I1001 11:48:02.189759 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27a3499b-1be2-457a-adc1-896f0297fc17","Type":"ContainerStarted","Data":"92050ecc7249609ece151052166eb1fd9e90b7aac0e1c851b80520ad4e5be4ab"} Oct 01 11:48:02 crc kubenswrapper[4669]: I1001 11:48:02.192645 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f","Type":"ContainerStarted","Data":"537450c763eb2de5b96da71d3437293838c3a87df2330f5cf5f2d9864091e85c"} Oct 01 11:48:02 crc kubenswrapper[4669]: I1001 11:48:02.194214 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" event={"ID":"2ba0e6c3-7c71-45df-acc3-127bec5afd42","Type":"ContainerStarted","Data":"2dede691fa28e0d1a1f22827d6d512207a66f743103c9f2b622e04d1267c4b0b"} Oct 01 11:48:02 crc kubenswrapper[4669]: I1001 11:48:02.204659 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csn4t" event={"ID":"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0","Type":"ContainerStarted","Data":"3a19d270666002f23c4ec0dec5dac48b73eb4ac725d66a0ed47deb1b8a32f2fa"} Oct 01 11:48:02 crc kubenswrapper[4669]: I1001 11:48:02.236414 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-csn4t" podStartSLOduration=3.236381759 podStartE2EDuration="3.236381759s" podCreationTimestamp="2025-10-01 11:47:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:48:02.231772347 +0000 UTC m=+1173.331337324" watchObservedRunningTime="2025-10-01 11:48:02.236381759 +0000 UTC m=+1173.335946746" Oct 01 11:48:02 crc kubenswrapper[4669]: I1001 11:48:02.396285 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c7d7q"] Oct 01 11:48:03 crc kubenswrapper[4669]: I1001 11:48:03.219484 4669 generic.go:334] "Generic (PLEG): container finished" podID="2ba0e6c3-7c71-45df-acc3-127bec5afd42" containerID="5745348de8bf960e21b71fc9926152c5fedc6f6584a3ab80e5ac06c24ea2c00c" exitCode=0 Oct 01 11:48:03 crc kubenswrapper[4669]: I1001 11:48:03.220344 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" 
event={"ID":"2ba0e6c3-7c71-45df-acc3-127bec5afd42","Type":"ContainerDied","Data":"5745348de8bf960e21b71fc9926152c5fedc6f6584a3ab80e5ac06c24ea2c00c"} Oct 01 11:48:03 crc kubenswrapper[4669]: I1001 11:48:03.228939 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c7d7q" event={"ID":"688eb6a7-b463-4b36-9ef7-a365cbabac1f","Type":"ContainerStarted","Data":"9c1f9528cc2e1978589e5ae1df339e0af1628a5d9c6e8153c4b394438d736741"} Oct 01 11:48:03 crc kubenswrapper[4669]: I1001 11:48:03.229009 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c7d7q" event={"ID":"688eb6a7-b463-4b36-9ef7-a365cbabac1f","Type":"ContainerStarted","Data":"65ae042e921d1534da0cdef92efd87ee9651069746b3cfaae73a62f889d258c4"} Oct 01 11:48:03 crc kubenswrapper[4669]: I1001 11:48:03.273549 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-c7d7q" podStartSLOduration=2.273525588 podStartE2EDuration="2.273525588s" podCreationTimestamp="2025-10-01 11:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:48:03.266778283 +0000 UTC m=+1174.366343280" watchObservedRunningTime="2025-10-01 11:48:03.273525588 +0000 UTC m=+1174.373090565" Oct 01 11:48:04 crc kubenswrapper[4669]: I1001 11:48:04.310031 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 11:48:04 crc kubenswrapper[4669]: I1001 11:48:04.338480 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:06 crc kubenswrapper[4669]: I1001 11:48:06.264632 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5ae9886-4487-4150-83bc-cbb211c9dea0","Type":"ContainerStarted","Data":"d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed"} Oct 01 11:48:06 crc 
kubenswrapper[4669]: I1001 11:48:06.266792 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"86b8cae7-2784-4192-8b18-3cd38d6123f3","Type":"ContainerStarted","Data":"1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87"} Oct 01 11:48:06 crc kubenswrapper[4669]: I1001 11:48:06.266938 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="86b8cae7-2784-4192-8b18-3cd38d6123f3" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87" gracePeriod=30 Oct 01 11:48:06 crc kubenswrapper[4669]: I1001 11:48:06.275578 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27a3499b-1be2-457a-adc1-896f0297fc17","Type":"ContainerStarted","Data":"9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206"} Oct 01 11:48:06 crc kubenswrapper[4669]: I1001 11:48:06.277403 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f","Type":"ContainerStarted","Data":"2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23"} Oct 01 11:48:06 crc kubenswrapper[4669]: I1001 11:48:06.282603 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" event={"ID":"2ba0e6c3-7c71-45df-acc3-127bec5afd42","Type":"ContainerStarted","Data":"08c22e60089a340143ab6e0e0af759c564feb95c9fe2cde22226bf2ef544c517"} Oct 01 11:48:06 crc kubenswrapper[4669]: I1001 11:48:06.283277 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:06 crc kubenswrapper[4669]: I1001 11:48:06.289313 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.384087261 podStartE2EDuration="6.289291472s" 
podCreationTimestamp="2025-10-01 11:48:00 +0000 UTC" firstStartedPulling="2025-10-01 11:48:01.63437096 +0000 UTC m=+1172.733935937" lastFinishedPulling="2025-10-01 11:48:05.539575171 +0000 UTC m=+1176.639140148" observedRunningTime="2025-10-01 11:48:06.283627964 +0000 UTC m=+1177.383192941" watchObservedRunningTime="2025-10-01 11:48:06.289291472 +0000 UTC m=+1177.388856449" Oct 01 11:48:06 crc kubenswrapper[4669]: I1001 11:48:06.314989 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" podStartSLOduration=6.314969008 podStartE2EDuration="6.314969008s" podCreationTimestamp="2025-10-01 11:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:48:06.309505305 +0000 UTC m=+1177.409070282" watchObservedRunningTime="2025-10-01 11:48:06.314969008 +0000 UTC m=+1177.414533985" Oct 01 11:48:06 crc kubenswrapper[4669]: I1001 11:48:06.343420 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.912085656 podStartE2EDuration="6.343395542s" podCreationTimestamp="2025-10-01 11:48:00 +0000 UTC" firstStartedPulling="2025-10-01 11:48:02.108295146 +0000 UTC m=+1173.207860113" lastFinishedPulling="2025-10-01 11:48:05.539605022 +0000 UTC m=+1176.639169999" observedRunningTime="2025-10-01 11:48:06.334464624 +0000 UTC m=+1177.434029601" watchObservedRunningTime="2025-10-01 11:48:06.343395542 +0000 UTC m=+1177.442960519" Oct 01 11:48:07 crc kubenswrapper[4669]: I1001 11:48:07.293321 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5ae9886-4487-4150-83bc-cbb211c9dea0","Type":"ContainerStarted","Data":"448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3"} Oct 01 11:48:07 crc kubenswrapper[4669]: I1001 11:48:07.293823 4669 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-metadata-0" podUID="d5ae9886-4487-4150-83bc-cbb211c9dea0" containerName="nova-metadata-log" containerID="cri-o://d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed" gracePeriod=30 Oct 01 11:48:07 crc kubenswrapper[4669]: I1001 11:48:07.294564 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d5ae9886-4487-4150-83bc-cbb211c9dea0" containerName="nova-metadata-metadata" containerID="cri-o://448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3" gracePeriod=30 Oct 01 11:48:07 crc kubenswrapper[4669]: I1001 11:48:07.302139 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27a3499b-1be2-457a-adc1-896f0297fc17","Type":"ContainerStarted","Data":"d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb"} Oct 01 11:48:07 crc kubenswrapper[4669]: I1001 11:48:07.324904 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.5062857640000002 podStartE2EDuration="7.324881784s" podCreationTimestamp="2025-10-01 11:48:00 +0000 UTC" firstStartedPulling="2025-10-01 11:48:01.757884522 +0000 UTC m=+1172.857449499" lastFinishedPulling="2025-10-01 11:48:05.576480542 +0000 UTC m=+1176.676045519" observedRunningTime="2025-10-01 11:48:07.323792827 +0000 UTC m=+1178.423357824" watchObservedRunningTime="2025-10-01 11:48:07.324881784 +0000 UTC m=+1178.424446761" Oct 01 11:48:07 crc kubenswrapper[4669]: I1001 11:48:07.353044 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.236567527 podStartE2EDuration="7.35302603s" podCreationTimestamp="2025-10-01 11:48:00 +0000 UTC" firstStartedPulling="2025-10-01 11:48:01.422451323 +0000 UTC m=+1172.522016300" lastFinishedPulling="2025-10-01 11:48:05.538909826 +0000 UTC m=+1176.638474803" observedRunningTime="2025-10-01 11:48:07.352890667 +0000 UTC 
m=+1178.452455654" watchObservedRunningTime="2025-10-01 11:48:07.35302603 +0000 UTC m=+1178.452591007" Oct 01 11:48:07 crc kubenswrapper[4669]: I1001 11:48:07.890833 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.066515 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-combined-ca-bundle\") pod \"d5ae9886-4487-4150-83bc-cbb211c9dea0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.066746 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn4ml\" (UniqueName: \"kubernetes.io/projected/d5ae9886-4487-4150-83bc-cbb211c9dea0-kube-api-access-kn4ml\") pod \"d5ae9886-4487-4150-83bc-cbb211c9dea0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.066847 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ae9886-4487-4150-83bc-cbb211c9dea0-logs\") pod \"d5ae9886-4487-4150-83bc-cbb211c9dea0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.066990 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-config-data\") pod \"d5ae9886-4487-4150-83bc-cbb211c9dea0\" (UID: \"d5ae9886-4487-4150-83bc-cbb211c9dea0\") " Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.069512 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ae9886-4487-4150-83bc-cbb211c9dea0-logs" (OuterVolumeSpecName: "logs") pod "d5ae9886-4487-4150-83bc-cbb211c9dea0" (UID: 
"d5ae9886-4487-4150-83bc-cbb211c9dea0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.094289 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ae9886-4487-4150-83bc-cbb211c9dea0-kube-api-access-kn4ml" (OuterVolumeSpecName: "kube-api-access-kn4ml") pod "d5ae9886-4487-4150-83bc-cbb211c9dea0" (UID: "d5ae9886-4487-4150-83bc-cbb211c9dea0"). InnerVolumeSpecName "kube-api-access-kn4ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.135501 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-config-data" (OuterVolumeSpecName: "config-data") pod "d5ae9886-4487-4150-83bc-cbb211c9dea0" (UID: "d5ae9886-4487-4150-83bc-cbb211c9dea0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.149312 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5ae9886-4487-4150-83bc-cbb211c9dea0" (UID: "d5ae9886-4487-4150-83bc-cbb211c9dea0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.172409 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn4ml\" (UniqueName: \"kubernetes.io/projected/d5ae9886-4487-4150-83bc-cbb211c9dea0-kube-api-access-kn4ml\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.172456 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ae9886-4487-4150-83bc-cbb211c9dea0-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.172467 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.172477 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ae9886-4487-4150-83bc-cbb211c9dea0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.312467 4669 generic.go:334] "Generic (PLEG): container finished" podID="d5ae9886-4487-4150-83bc-cbb211c9dea0" containerID="448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3" exitCode=0 Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.312508 4669 generic.go:334] "Generic (PLEG): container finished" podID="d5ae9886-4487-4150-83bc-cbb211c9dea0" containerID="d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed" exitCode=143 Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.313609 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.315317 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5ae9886-4487-4150-83bc-cbb211c9dea0","Type":"ContainerDied","Data":"448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3"} Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.315434 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5ae9886-4487-4150-83bc-cbb211c9dea0","Type":"ContainerDied","Data":"d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed"} Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.315448 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5ae9886-4487-4150-83bc-cbb211c9dea0","Type":"ContainerDied","Data":"202498e1b27ea0427b42961da482596f65947fd4f8adeb9b0488f2b34a8140a2"} Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.315470 4669 scope.go:117] "RemoveContainer" containerID="448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.346615 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.349695 4669 scope.go:117] "RemoveContainer" containerID="d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.356594 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.381501 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:08 crc kubenswrapper[4669]: E1001 11:48:08.382156 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ae9886-4487-4150-83bc-cbb211c9dea0" containerName="nova-metadata-metadata" Oct 01 11:48:08 crc 
kubenswrapper[4669]: I1001 11:48:08.382181 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ae9886-4487-4150-83bc-cbb211c9dea0" containerName="nova-metadata-metadata" Oct 01 11:48:08 crc kubenswrapper[4669]: E1001 11:48:08.382201 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ae9886-4487-4150-83bc-cbb211c9dea0" containerName="nova-metadata-log" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.382230 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ae9886-4487-4150-83bc-cbb211c9dea0" containerName="nova-metadata-log" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.382508 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ae9886-4487-4150-83bc-cbb211c9dea0" containerName="nova-metadata-log" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.382530 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ae9886-4487-4150-83bc-cbb211c9dea0" containerName="nova-metadata-metadata" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.385345 4669 scope.go:117] "RemoveContainer" containerID="448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3" Oct 01 11:48:08 crc kubenswrapper[4669]: E1001 11:48:08.386273 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3\": container with ID starting with 448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3 not found: ID does not exist" containerID="448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.386337 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3"} err="failed to get container status \"448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3\": rpc error: code = 
NotFound desc = could not find container \"448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3\": container with ID starting with 448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3 not found: ID does not exist" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.386388 4669 scope.go:117] "RemoveContainer" containerID="d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed" Oct 01 11:48:08 crc kubenswrapper[4669]: E1001 11:48:08.386913 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed\": container with ID starting with d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed not found: ID does not exist" containerID="d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.386945 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed"} err="failed to get container status \"d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed\": rpc error: code = NotFound desc = could not find container \"d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed\": container with ID starting with d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed not found: ID does not exist" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.386969 4669 scope.go:117] "RemoveContainer" containerID="448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.387901 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3"} err="failed to get container status \"448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3\": rpc 
error: code = NotFound desc = could not find container \"448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3\": container with ID starting with 448fba0d9b2a80f207781f6a25e8b17786fb146908dac98738efd0611a25daf3 not found: ID does not exist" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.387938 4669 scope.go:117] "RemoveContainer" containerID="d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.388426 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed"} err="failed to get container status \"d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed\": rpc error: code = NotFound desc = could not find container \"d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed\": container with ID starting with d4aaba74281fd0a51e04e159d31f3422213ecb5362f3f06fd8a28f4c4d3b37ed not found: ID does not exist" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.389512 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.393273 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.393519 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.429430 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.580566 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22b29142-6359-467e-b01f-5c2615146b2c-logs\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.581330 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.581502 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-config-data\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.581704 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.581854 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmgrd\" (UniqueName: \"kubernetes.io/projected/22b29142-6359-467e-b01f-5c2615146b2c-kube-api-access-hmgrd\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.684029 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22b29142-6359-467e-b01f-5c2615146b2c-logs\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.684163 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.684238 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-config-data\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.684345 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.684411 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmgrd\" (UniqueName: \"kubernetes.io/projected/22b29142-6359-467e-b01f-5c2615146b2c-kube-api-access-hmgrd\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.685894 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22b29142-6359-467e-b01f-5c2615146b2c-logs\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.692336 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.692657 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-config-data\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.693826 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.703241 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmgrd\" (UniqueName: \"kubernetes.io/projected/22b29142-6359-467e-b01f-5c2615146b2c-kube-api-access-hmgrd\") pod 
\"nova-metadata-0\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " pod="openstack/nova-metadata-0" Oct 01 11:48:08 crc kubenswrapper[4669]: I1001 11:48:08.724430 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:48:09 crc kubenswrapper[4669]: I1001 11:48:09.282607 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:09 crc kubenswrapper[4669]: I1001 11:48:09.329743 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"22b29142-6359-467e-b01f-5c2615146b2c","Type":"ContainerStarted","Data":"fa154f64170be63ad81914eb5a6c34343a445871b99bd9da3d21d5d8a9f3e287"} Oct 01 11:48:09 crc kubenswrapper[4669]: I1001 11:48:09.658311 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ae9886-4487-4150-83bc-cbb211c9dea0" path="/var/lib/kubelet/pods/d5ae9886-4487-4150-83bc-cbb211c9dea0/volumes" Oct 01 11:48:10 crc kubenswrapper[4669]: I1001 11:48:10.358989 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"22b29142-6359-467e-b01f-5c2615146b2c","Type":"ContainerStarted","Data":"fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f"} Oct 01 11:48:10 crc kubenswrapper[4669]: I1001 11:48:10.359064 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"22b29142-6359-467e-b01f-5c2615146b2c","Type":"ContainerStarted","Data":"e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e"} Oct 01 11:48:10 crc kubenswrapper[4669]: I1001 11:48:10.401301 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.401275986 podStartE2EDuration="2.401275986s" podCreationTimestamp="2025-10-01 11:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-01 11:48:10.389332805 +0000 UTC m=+1181.488897782" watchObservedRunningTime="2025-10-01 11:48:10.401275986 +0000 UTC m=+1181.500840963" Oct 01 11:48:10 crc kubenswrapper[4669]: I1001 11:48:10.562320 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 11:48:10 crc kubenswrapper[4669]: I1001 11:48:10.562409 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 11:48:10 crc kubenswrapper[4669]: I1001 11:48:10.591599 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:11 crc kubenswrapper[4669]: I1001 11:48:11.193568 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 11:48:11 crc kubenswrapper[4669]: I1001 11:48:11.194466 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 11:48:11 crc kubenswrapper[4669]: I1001 11:48:11.252825 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 11:48:11 crc kubenswrapper[4669]: I1001 11:48:11.374721 4669 generic.go:334] "Generic (PLEG): container finished" podID="9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0" containerID="3a19d270666002f23c4ec0dec5dac48b73eb4ac725d66a0ed47deb1b8a32f2fa" exitCode=0 Oct 01 11:48:11 crc kubenswrapper[4669]: I1001 11:48:11.374832 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csn4t" event={"ID":"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0","Type":"ContainerDied","Data":"3a19d270666002f23c4ec0dec5dac48b73eb4ac725d66a0ed47deb1b8a32f2fa"} Oct 01 11:48:11 crc kubenswrapper[4669]: I1001 11:48:11.431974 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 11:48:11 crc kubenswrapper[4669]: I1001 11:48:11.645325 4669 prober.go:107] 
"Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="27a3499b-1be2-457a-adc1-896f0297fc17" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 11:48:11 crc kubenswrapper[4669]: I1001 11:48:11.645487 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="27a3499b-1be2-457a-adc1-896f0297fc17" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 11:48:12 crc kubenswrapper[4669]: I1001 11:48:12.849536 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:12 crc kubenswrapper[4669]: I1001 11:48:12.990049 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8nm2\" (UniqueName: \"kubernetes.io/projected/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-kube-api-access-l8nm2\") pod \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " Oct 01 11:48:12 crc kubenswrapper[4669]: I1001 11:48:12.990229 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-config-data\") pod \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " Oct 01 11:48:12 crc kubenswrapper[4669]: I1001 11:48:12.990413 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-combined-ca-bundle\") pod \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " Oct 01 11:48:12 crc kubenswrapper[4669]: I1001 11:48:12.990497 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-scripts\") pod \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\" (UID: \"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0\") " Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.011909 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-kube-api-access-l8nm2" (OuterVolumeSpecName: "kube-api-access-l8nm2") pod "9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0" (UID: "9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0"). InnerVolumeSpecName "kube-api-access-l8nm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.012061 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-scripts" (OuterVolumeSpecName: "scripts") pod "9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0" (UID: "9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.027529 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0" (UID: "9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.041365 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-config-data" (OuterVolumeSpecName: "config-data") pod "9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0" (UID: "9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.096324 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8nm2\" (UniqueName: \"kubernetes.io/projected/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-kube-api-access-l8nm2\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.096377 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.096395 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.096417 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.338982 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.412035 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csn4t" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.414102 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csn4t" event={"ID":"9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0","Type":"ContainerDied","Data":"60d2b8d95fc24247f3f683240000bf50c5d07649995861bb1f36fc96cac80d10"} Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.414198 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60d2b8d95fc24247f3f683240000bf50c5d07649995861bb1f36fc96cac80d10" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.600410 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.600743 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="27a3499b-1be2-457a-adc1-896f0297fc17" containerName="nova-api-log" containerID="cri-o://9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206" gracePeriod=30 Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.600891 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="27a3499b-1be2-457a-adc1-896f0297fc17" containerName="nova-api-api" containerID="cri-o://d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb" gracePeriod=30 Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.619903 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.657991 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.658767 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="22b29142-6359-467e-b01f-5c2615146b2c" containerName="nova-metadata-log" 
containerID="cri-o://e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e" gracePeriod=30 Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.658920 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="22b29142-6359-467e-b01f-5c2615146b2c" containerName="nova-metadata-metadata" containerID="cri-o://fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f" gracePeriod=30 Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.725248 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 11:48:13 crc kubenswrapper[4669]: I1001 11:48:13.725333 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.144743 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.328080 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22b29142-6359-467e-b01f-5c2615146b2c-logs\") pod \"22b29142-6359-467e-b01f-5c2615146b2c\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.328459 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmgrd\" (UniqueName: \"kubernetes.io/projected/22b29142-6359-467e-b01f-5c2615146b2c-kube-api-access-hmgrd\") pod \"22b29142-6359-467e-b01f-5c2615146b2c\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.328534 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-nova-metadata-tls-certs\") pod \"22b29142-6359-467e-b01f-5c2615146b2c\" (UID: 
\"22b29142-6359-467e-b01f-5c2615146b2c\") " Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.328763 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-combined-ca-bundle\") pod \"22b29142-6359-467e-b01f-5c2615146b2c\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.328890 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-config-data\") pod \"22b29142-6359-467e-b01f-5c2615146b2c\" (UID: \"22b29142-6359-467e-b01f-5c2615146b2c\") " Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.328948 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22b29142-6359-467e-b01f-5c2615146b2c-logs" (OuterVolumeSpecName: "logs") pod "22b29142-6359-467e-b01f-5c2615146b2c" (UID: "22b29142-6359-467e-b01f-5c2615146b2c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.329402 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22b29142-6359-467e-b01f-5c2615146b2c-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.346189 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b29142-6359-467e-b01f-5c2615146b2c-kube-api-access-hmgrd" (OuterVolumeSpecName: "kube-api-access-hmgrd") pod "22b29142-6359-467e-b01f-5c2615146b2c" (UID: "22b29142-6359-467e-b01f-5c2615146b2c"). InnerVolumeSpecName "kube-api-access-hmgrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.358137 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-config-data" (OuterVolumeSpecName: "config-data") pod "22b29142-6359-467e-b01f-5c2615146b2c" (UID: "22b29142-6359-467e-b01f-5c2615146b2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.361192 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22b29142-6359-467e-b01f-5c2615146b2c" (UID: "22b29142-6359-467e-b01f-5c2615146b2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.414021 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "22b29142-6359-467e-b01f-5c2615146b2c" (UID: "22b29142-6359-467e-b01f-5c2615146b2c"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.428979 4669 generic.go:334] "Generic (PLEG): container finished" podID="22b29142-6359-467e-b01f-5c2615146b2c" containerID="fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f" exitCode=0 Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.429031 4669 generic.go:334] "Generic (PLEG): container finished" podID="22b29142-6359-467e-b01f-5c2615146b2c" containerID="e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e" exitCode=143 Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.429059 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.429127 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"22b29142-6359-467e-b01f-5c2615146b2c","Type":"ContainerDied","Data":"fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f"} Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.429273 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"22b29142-6359-467e-b01f-5c2615146b2c","Type":"ContainerDied","Data":"e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e"} Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.429308 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"22b29142-6359-467e-b01f-5c2615146b2c","Type":"ContainerDied","Data":"fa154f64170be63ad81914eb5a6c34343a445871b99bd9da3d21d5d8a9f3e287"} Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.429317 4669 scope.go:117] "RemoveContainer" containerID="fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.436850 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmgrd\" (UniqueName: 
\"kubernetes.io/projected/22b29142-6359-467e-b01f-5c2615146b2c-kube-api-access-hmgrd\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.436923 4669 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.436953 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.436980 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22b29142-6359-467e-b01f-5c2615146b2c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.442275 4669 generic.go:334] "Generic (PLEG): container finished" podID="27a3499b-1be2-457a-adc1-896f0297fc17" containerID="9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206" exitCode=143 Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.442509 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27a3499b-1be2-457a-adc1-896f0297fc17","Type":"ContainerDied","Data":"9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206"} Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.448272 4669 generic.go:334] "Generic (PLEG): container finished" podID="688eb6a7-b463-4b36-9ef7-a365cbabac1f" containerID="9c1f9528cc2e1978589e5ae1df339e0af1628a5d9c6e8153c4b394438d736741" exitCode=0 Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.448369 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c7d7q" 
event={"ID":"688eb6a7-b463-4b36-9ef7-a365cbabac1f","Type":"ContainerDied","Data":"9c1f9528cc2e1978589e5ae1df339e0af1628a5d9c6e8153c4b394438d736741"} Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.448511 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f" containerName="nova-scheduler-scheduler" containerID="cri-o://2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23" gracePeriod=30 Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.485110 4669 scope.go:117] "RemoveContainer" containerID="e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.508182 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.522667 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.527376 4669 scope.go:117] "RemoveContainer" containerID="fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f" Oct 01 11:48:14 crc kubenswrapper[4669]: E1001 11:48:14.527873 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f\": container with ID starting with fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f not found: ID does not exist" containerID="fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.527911 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f"} err="failed to get container status \"fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f\": rpc error: code = NotFound 
desc = could not find container \"fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f\": container with ID starting with fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f not found: ID does not exist" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.527937 4669 scope.go:117] "RemoveContainer" containerID="e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e" Oct 01 11:48:14 crc kubenswrapper[4669]: E1001 11:48:14.528250 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e\": container with ID starting with e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e not found: ID does not exist" containerID="e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.528310 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e"} err="failed to get container status \"e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e\": rpc error: code = NotFound desc = could not find container \"e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e\": container with ID starting with e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e not found: ID does not exist" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.528331 4669 scope.go:117] "RemoveContainer" containerID="fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.528848 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f"} err="failed to get container status \"fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f\": rpc error: code = 
NotFound desc = could not find container \"fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f\": container with ID starting with fbf4b6ee9f545c36ef84e9718b02ca292fe90b91710137205d21bcef27da839f not found: ID does not exist" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.528869 4669 scope.go:117] "RemoveContainer" containerID="e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.529151 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e"} err="failed to get container status \"e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e\": rpc error: code = NotFound desc = could not find container \"e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e\": container with ID starting with e5aba977afb4135d91a85910b1881dfaf3e4e3da26bec9206ffc9abd4254266e not found: ID does not exist" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.532320 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:14 crc kubenswrapper[4669]: E1001 11:48:14.532857 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b29142-6359-467e-b01f-5c2615146b2c" containerName="nova-metadata-metadata" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.532878 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b29142-6359-467e-b01f-5c2615146b2c" containerName="nova-metadata-metadata" Oct 01 11:48:14 crc kubenswrapper[4669]: E1001 11:48:14.532902 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b29142-6359-467e-b01f-5c2615146b2c" containerName="nova-metadata-log" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.532908 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b29142-6359-467e-b01f-5c2615146b2c" containerName="nova-metadata-log" Oct 01 11:48:14 crc 
kubenswrapper[4669]: E1001 11:48:14.532936 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0" containerName="nova-manage" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.532943 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0" containerName="nova-manage" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.533174 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b29142-6359-467e-b01f-5c2615146b2c" containerName="nova-metadata-metadata" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.533201 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b29142-6359-467e-b01f-5c2615146b2c" containerName="nova-metadata-log" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.533210 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0" containerName="nova-manage" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.534358 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.541461 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.599316 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.600118 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.642303 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzjp2\" (UniqueName: \"kubernetes.io/projected/285b539b-1b0c-4bb1-a197-ca28afe29810-kube-api-access-hzjp2\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.642386 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.642414 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-config-data\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.642441 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/285b539b-1b0c-4bb1-a197-ca28afe29810-logs\") pod \"nova-metadata-0\" (UID: 
\"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.642567 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.744575 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.744667 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjp2\" (UniqueName: \"kubernetes.io/projected/285b539b-1b0c-4bb1-a197-ca28afe29810-kube-api-access-hzjp2\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.744730 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.744773 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-config-data\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 
11:48:14.744812 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/285b539b-1b0c-4bb1-a197-ca28afe29810-logs\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.745662 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/285b539b-1b0c-4bb1-a197-ca28afe29810-logs\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.750179 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-config-data\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.750600 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.751987 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.764226 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjp2\" (UniqueName: \"kubernetes.io/projected/285b539b-1b0c-4bb1-a197-ca28afe29810-kube-api-access-hzjp2\") pod \"nova-metadata-0\" (UID: 
\"285b539b-1b0c-4bb1-a197-ca28afe29810\") " pod="openstack/nova-metadata-0" Oct 01 11:48:14 crc kubenswrapper[4669]: I1001 11:48:14.947504 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:48:15 crc kubenswrapper[4669]: I1001 11:48:15.473996 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:15 crc kubenswrapper[4669]: I1001 11:48:15.662679 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b29142-6359-467e-b01f-5c2615146b2c" path="/var/lib/kubelet/pods/22b29142-6359-467e-b01f-5c2615146b2c/volumes" Oct 01 11:48:15 crc kubenswrapper[4669]: I1001 11:48:15.937417 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:15 crc kubenswrapper[4669]: I1001 11:48:15.959389 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.027785 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-scripts\") pod \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.027846 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flvbz\" (UniqueName: \"kubernetes.io/projected/688eb6a7-b463-4b36-9ef7-a365cbabac1f-kube-api-access-flvbz\") pod \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.043417 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-scripts" (OuterVolumeSpecName: "scripts") pod 
"688eb6a7-b463-4b36-9ef7-a365cbabac1f" (UID: "688eb6a7-b463-4b36-9ef7-a365cbabac1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.069370 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688eb6a7-b463-4b36-9ef7-a365cbabac1f-kube-api-access-flvbz" (OuterVolumeSpecName: "kube-api-access-flvbz") pod "688eb6a7-b463-4b36-9ef7-a365cbabac1f" (UID: "688eb6a7-b463-4b36-9ef7-a365cbabac1f"). InnerVolumeSpecName "kube-api-access-flvbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.084173 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-w5zw6"] Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.084521 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" podUID="af3be2a0-85e4-4833-89c9-4450ee2e5635" containerName="dnsmasq-dns" containerID="cri-o://a0680eea8c03ebff6a0993e3f57b30a42388ff869bfb969138f837828631b85c" gracePeriod=10 Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.141099 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-config-data\") pod \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.141420 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-combined-ca-bundle\") pod \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\" (UID: \"688eb6a7-b463-4b36-9ef7-a365cbabac1f\") " Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.141890 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.141910 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flvbz\" (UniqueName: \"kubernetes.io/projected/688eb6a7-b463-4b36-9ef7-a365cbabac1f-kube-api-access-flvbz\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.182606 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "688eb6a7-b463-4b36-9ef7-a365cbabac1f" (UID: "688eb6a7-b463-4b36-9ef7-a365cbabac1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.199908 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-config-data" (OuterVolumeSpecName: "config-data") pod "688eb6a7-b463-4b36-9ef7-a365cbabac1f" (UID: "688eb6a7-b463-4b36-9ef7-a365cbabac1f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:16 crc kubenswrapper[4669]: E1001 11:48:16.202681 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 11:48:16 crc kubenswrapper[4669]: E1001 11:48:16.212651 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 11:48:16 crc kubenswrapper[4669]: E1001 11:48:16.214887 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 11:48:16 crc kubenswrapper[4669]: E1001 11:48:16.214957 4669 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f" containerName="nova-scheduler-scheduler" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.245133 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.245181 4669 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688eb6a7-b463-4b36-9ef7-a365cbabac1f-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.522509 4669 generic.go:334] "Generic (PLEG): container finished" podID="af3be2a0-85e4-4833-89c9-4450ee2e5635" containerID="a0680eea8c03ebff6a0993e3f57b30a42388ff869bfb969138f837828631b85c" exitCode=0 Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.522646 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" event={"ID":"af3be2a0-85e4-4833-89c9-4450ee2e5635","Type":"ContainerDied","Data":"a0680eea8c03ebff6a0993e3f57b30a42388ff869bfb969138f837828631b85c"} Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.555687 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c7d7q" event={"ID":"688eb6a7-b463-4b36-9ef7-a365cbabac1f","Type":"ContainerDied","Data":"65ae042e921d1534da0cdef92efd87ee9651069746b3cfaae73a62f889d258c4"} Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.555759 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65ae042e921d1534da0cdef92efd87ee9651069746b3cfaae73a62f889d258c4" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.555910 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c7d7q" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.579240 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"285b539b-1b0c-4bb1-a197-ca28afe29810","Type":"ContainerStarted","Data":"f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6"} Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.579304 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"285b539b-1b0c-4bb1-a197-ca28afe29810","Type":"ContainerStarted","Data":"0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3"} Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.579318 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"285b539b-1b0c-4bb1-a197-ca28afe29810","Type":"ContainerStarted","Data":"0a26f6631df18abbc611063e92db5f0449f4ccc1aa90c288ae1bb9339c0dfbeb"} Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.637394 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 11:48:16 crc kubenswrapper[4669]: E1001 11:48:16.638012 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688eb6a7-b463-4b36-9ef7-a365cbabac1f" containerName="nova-cell1-conductor-db-sync" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.638028 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="688eb6a7-b463-4b36-9ef7-a365cbabac1f" containerName="nova-cell1-conductor-db-sync" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.638278 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="688eb6a7-b463-4b36-9ef7-a365cbabac1f" containerName="nova-cell1-conductor-db-sync" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.639167 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.651432 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.662918 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkjtg\" (UniqueName: \"kubernetes.io/projected/2266ee85-7b31-496a-9dbd-6d69e282e847-kube-api-access-rkjtg\") pod \"nova-cell1-conductor-0\" (UID: \"2266ee85-7b31-496a-9dbd-6d69e282e847\") " pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.662977 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2266ee85-7b31-496a-9dbd-6d69e282e847-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2266ee85-7b31-496a-9dbd-6d69e282e847\") " pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.663066 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2266ee85-7b31-496a-9dbd-6d69e282e847-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2266ee85-7b31-496a-9dbd-6d69e282e847\") " pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.668698 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.668814 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6687801479999997 podStartE2EDuration="2.668780148s" podCreationTimestamp="2025-10-01 11:48:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:48:16.625726718 +0000 UTC m=+1187.725291695" watchObservedRunningTime="2025-10-01 11:48:16.668780148 +0000 UTC m=+1187.768345125" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.705646 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.765120 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkjtg\" (UniqueName: \"kubernetes.io/projected/2266ee85-7b31-496a-9dbd-6d69e282e847-kube-api-access-rkjtg\") pod \"nova-cell1-conductor-0\" (UID: \"2266ee85-7b31-496a-9dbd-6d69e282e847\") " pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.765184 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2266ee85-7b31-496a-9dbd-6d69e282e847-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2266ee85-7b31-496a-9dbd-6d69e282e847\") " pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.765284 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2266ee85-7b31-496a-9dbd-6d69e282e847-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2266ee85-7b31-496a-9dbd-6d69e282e847\") " pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.773930 
4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2266ee85-7b31-496a-9dbd-6d69e282e847-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2266ee85-7b31-496a-9dbd-6d69e282e847\") " pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.774387 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2266ee85-7b31-496a-9dbd-6d69e282e847-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2266ee85-7b31-496a-9dbd-6d69e282e847\") " pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.786677 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkjtg\" (UniqueName: \"kubernetes.io/projected/2266ee85-7b31-496a-9dbd-6d69e282e847-kube-api-access-rkjtg\") pod \"nova-cell1-conductor-0\" (UID: \"2266ee85-7b31-496a-9dbd-6d69e282e847\") " pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.867426 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4w8g\" (UniqueName: \"kubernetes.io/projected/af3be2a0-85e4-4833-89c9-4450ee2e5635-kube-api-access-t4w8g\") pod \"af3be2a0-85e4-4833-89c9-4450ee2e5635\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.867529 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-config\") pod \"af3be2a0-85e4-4833-89c9-4450ee2e5635\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.867599 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-svc\") pod \"af3be2a0-85e4-4833-89c9-4450ee2e5635\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.867684 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-sb\") pod \"af3be2a0-85e4-4833-89c9-4450ee2e5635\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.867735 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-swift-storage-0\") pod \"af3be2a0-85e4-4833-89c9-4450ee2e5635\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.867758 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-nb\") pod \"af3be2a0-85e4-4833-89c9-4450ee2e5635\" (UID: \"af3be2a0-85e4-4833-89c9-4450ee2e5635\") " Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.873763 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3be2a0-85e4-4833-89c9-4450ee2e5635-kube-api-access-t4w8g" (OuterVolumeSpecName: "kube-api-access-t4w8g") pod "af3be2a0-85e4-4833-89c9-4450ee2e5635" (UID: "af3be2a0-85e4-4833-89c9-4450ee2e5635"). InnerVolumeSpecName "kube-api-access-t4w8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.941684 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af3be2a0-85e4-4833-89c9-4450ee2e5635" (UID: "af3be2a0-85e4-4833-89c9-4450ee2e5635"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.942441 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af3be2a0-85e4-4833-89c9-4450ee2e5635" (UID: "af3be2a0-85e4-4833-89c9-4450ee2e5635"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.944579 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-config" (OuterVolumeSpecName: "config") pod "af3be2a0-85e4-4833-89c9-4450ee2e5635" (UID: "af3be2a0-85e4-4833-89c9-4450ee2e5635"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.967734 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "af3be2a0-85e4-4833-89c9-4450ee2e5635" (UID: "af3be2a0-85e4-4833-89c9-4450ee2e5635"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.970452 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.970488 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.970501 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4w8g\" (UniqueName: \"kubernetes.io/projected/af3be2a0-85e4-4833-89c9-4450ee2e5635-kube-api-access-t4w8g\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.970516 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:16 crc kubenswrapper[4669]: I1001 11:48:16.970524 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.005828 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.012467 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af3be2a0-85e4-4833-89c9-4450ee2e5635" (UID: "af3be2a0-85e4-4833-89c9-4450ee2e5635"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.072384 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af3be2a0-85e4-4833-89c9-4450ee2e5635-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.214532 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.377039 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27a3499b-1be2-457a-adc1-896f0297fc17-logs\") pod \"27a3499b-1be2-457a-adc1-896f0297fc17\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.377124 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-config-data\") pod \"27a3499b-1be2-457a-adc1-896f0297fc17\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.377173 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shpfw\" (UniqueName: \"kubernetes.io/projected/27a3499b-1be2-457a-adc1-896f0297fc17-kube-api-access-shpfw\") pod \"27a3499b-1be2-457a-adc1-896f0297fc17\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.377249 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-combined-ca-bundle\") pod \"27a3499b-1be2-457a-adc1-896f0297fc17\" (UID: \"27a3499b-1be2-457a-adc1-896f0297fc17\") " Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.382914 4669 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27a3499b-1be2-457a-adc1-896f0297fc17-logs" (OuterVolumeSpecName: "logs") pod "27a3499b-1be2-457a-adc1-896f0297fc17" (UID: "27a3499b-1be2-457a-adc1-896f0297fc17"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.385492 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a3499b-1be2-457a-adc1-896f0297fc17-kube-api-access-shpfw" (OuterVolumeSpecName: "kube-api-access-shpfw") pod "27a3499b-1be2-457a-adc1-896f0297fc17" (UID: "27a3499b-1be2-457a-adc1-896f0297fc17"). InnerVolumeSpecName "kube-api-access-shpfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.411458 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27a3499b-1be2-457a-adc1-896f0297fc17" (UID: "27a3499b-1be2-457a-adc1-896f0297fc17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.412017 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-config-data" (OuterVolumeSpecName: "config-data") pod "27a3499b-1be2-457a-adc1-896f0297fc17" (UID: "27a3499b-1be2-457a-adc1-896f0297fc17"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.479864 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27a3499b-1be2-457a-adc1-896f0297fc17-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.479904 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.479919 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shpfw\" (UniqueName: \"kubernetes.io/projected/27a3499b-1be2-457a-adc1-896f0297fc17-kube-api-access-shpfw\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.479933 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a3499b-1be2-457a-adc1-896f0297fc17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.518146 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 11:48:17 crc kubenswrapper[4669]: W1001 11:48:17.522258 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2266ee85_7b31_496a_9dbd_6d69e282e847.slice/crio-bb06cb49f332cec89616f4b3c3a3498ebd202c8fe99a64bfe71c71df6609140d WatchSource:0}: Error finding container bb06cb49f332cec89616f4b3c3a3498ebd202c8fe99a64bfe71c71df6609140d: Status 404 returned error can't find the container with id bb06cb49f332cec89616f4b3c3a3498ebd202c8fe99a64bfe71c71df6609140d Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.591976 4669 generic.go:334] "Generic (PLEG): container finished" podID="27a3499b-1be2-457a-adc1-896f0297fc17" 
containerID="d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb" exitCode=0 Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.592097 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.592110 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27a3499b-1be2-457a-adc1-896f0297fc17","Type":"ContainerDied","Data":"d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb"} Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.592166 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27a3499b-1be2-457a-adc1-896f0297fc17","Type":"ContainerDied","Data":"92050ecc7249609ece151052166eb1fd9e90b7aac0e1c851b80520ad4e5be4ab"} Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.592188 4669 scope.go:117] "RemoveContainer" containerID="d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.594194 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2266ee85-7b31-496a-9dbd-6d69e282e847","Type":"ContainerStarted","Data":"bb06cb49f332cec89616f4b3c3a3498ebd202c8fe99a64bfe71c71df6609140d"} Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.598588 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" event={"ID":"af3be2a0-85e4-4833-89c9-4450ee2e5635","Type":"ContainerDied","Data":"08d006677944b8b465e92beaeeee845682387e7e5f54b447974e8c0de3148786"} Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.598674 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-w5zw6" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.632456 4669 scope.go:117] "RemoveContainer" containerID="9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.636428 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.673660 4669 scope.go:117] "RemoveContainer" containerID="d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb" Oct 01 11:48:17 crc kubenswrapper[4669]: E1001 11:48:17.674415 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb\": container with ID starting with d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb not found: ID does not exist" containerID="d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.674571 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb"} err="failed to get container status \"d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb\": rpc error: code = NotFound desc = could not find container \"d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb\": container with ID starting with d12686f4486d742e63470456eec8091d37469ae8e02ab6549da8918dfb2044cb not found: ID does not exist" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.674664 4669 scope.go:117] "RemoveContainer" containerID="9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206" Oct 01 11:48:17 crc kubenswrapper[4669]: E1001 11:48:17.677588 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206\": container with ID starting with 9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206 not found: ID does not exist" containerID="9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.677672 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206"} err="failed to get container status \"9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206\": rpc error: code = NotFound desc = could not find container \"9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206\": container with ID starting with 9aa7376e208319492c106adb64456fb2cc5d44f5012456a96580a3e2e898a206 not found: ID does not exist" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.677723 4669 scope.go:117] "RemoveContainer" containerID="a0680eea8c03ebff6a0993e3f57b30a42388ff869bfb969138f837828631b85c" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.679870 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.679913 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-w5zw6"] Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.686939 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:17 crc kubenswrapper[4669]: E1001 11:48:17.687711 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3be2a0-85e4-4833-89c9-4450ee2e5635" containerName="dnsmasq-dns" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.687737 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3be2a0-85e4-4833-89c9-4450ee2e5635" containerName="dnsmasq-dns" Oct 01 11:48:17 crc kubenswrapper[4669]: E1001 11:48:17.687750 4669 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="af3be2a0-85e4-4833-89c9-4450ee2e5635" containerName="init" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.687770 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3be2a0-85e4-4833-89c9-4450ee2e5635" containerName="init" Oct 01 11:48:17 crc kubenswrapper[4669]: E1001 11:48:17.687792 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a3499b-1be2-457a-adc1-896f0297fc17" containerName="nova-api-log" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.687801 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a3499b-1be2-457a-adc1-896f0297fc17" containerName="nova-api-log" Oct 01 11:48:17 crc kubenswrapper[4669]: E1001 11:48:17.687830 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a3499b-1be2-457a-adc1-896f0297fc17" containerName="nova-api-api" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.687842 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a3499b-1be2-457a-adc1-896f0297fc17" containerName="nova-api-api" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.688637 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a3499b-1be2-457a-adc1-896f0297fc17" containerName="nova-api-api" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.689030 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3be2a0-85e4-4833-89c9-4450ee2e5635" containerName="dnsmasq-dns" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.689049 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a3499b-1be2-457a-adc1-896f0297fc17" containerName="nova-api-log" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.690397 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.702588 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.714537 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-w5zw6"] Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.715131 4669 scope.go:117] "RemoveContainer" containerID="8b6efcae2431858e2175af34979ae7bc0ee9a353088e486520fafabf8b37dc0c" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.725831 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.786147 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrhn8\" (UniqueName: \"kubernetes.io/projected/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-kube-api-access-nrhn8\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.786260 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-config-data\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.786327 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.786379 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-logs\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.888460 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-config-data\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.889262 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.889749 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-logs\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.889974 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrhn8\" (UniqueName: \"kubernetes.io/projected/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-kube-api-access-nrhn8\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.890228 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-logs\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.893116 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.894680 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-config-data\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:17 crc kubenswrapper[4669]: I1001 11:48:17.910585 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrhn8\" (UniqueName: \"kubernetes.io/projected/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-kube-api-access-nrhn8\") pod \"nova-api-0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " pod="openstack/nova-api-0" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.032014 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.151980 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.152873 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="9d832718-661a-44fb-bcc8-7f48af908b15" containerName="kube-state-metrics" containerID="cri-o://d7e3f25f398c9230aa728260c9f128890931fd534923084cf3e18d565b8c6014" gracePeriod=30 Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.476377 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.606840 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-combined-ca-bundle\") pod \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\" (UID: \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.606983 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-config-data\") pod \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\" (UID: \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.607196 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hkxf\" (UniqueName: \"kubernetes.io/projected/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-kube-api-access-6hkxf\") pod \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\" (UID: \"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f\") " Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.614645 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-kube-api-access-6hkxf" (OuterVolumeSpecName: "kube-api-access-6hkxf") pod "2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f" (UID: "2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f"). InnerVolumeSpecName "kube-api-access-6hkxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.618488 4669 generic.go:334] "Generic (PLEG): container finished" podID="2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f" containerID="2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23" exitCode=0 Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.618623 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.619510 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f","Type":"ContainerDied","Data":"2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23"} Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.619549 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f","Type":"ContainerDied","Data":"537450c763eb2de5b96da71d3437293838c3a87df2330f5cf5f2d9864091e85c"} Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.619568 4669 scope.go:117] "RemoveContainer" containerID="2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.625793 4669 generic.go:334] "Generic (PLEG): container finished" podID="9d832718-661a-44fb-bcc8-7f48af908b15" containerID="d7e3f25f398c9230aa728260c9f128890931fd534923084cf3e18d565b8c6014" exitCode=2 Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.625859 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9d832718-661a-44fb-bcc8-7f48af908b15","Type":"ContainerDied","Data":"d7e3f25f398c9230aa728260c9f128890931fd534923084cf3e18d565b8c6014"} Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.631541 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2266ee85-7b31-496a-9dbd-6d69e282e847","Type":"ContainerStarted","Data":"630b40dec62af475d4d34ada45a3451a8c3a7cd2a66dfba20372f7d7a979452a"} Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.632935 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.644980 4669 scope.go:117] "RemoveContainer" 
containerID="2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23" Oct 01 11:48:18 crc kubenswrapper[4669]: E1001 11:48:18.647070 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23\": container with ID starting with 2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23 not found: ID does not exist" containerID="2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.647250 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23"} err="failed to get container status \"2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23\": rpc error: code = NotFound desc = could not find container \"2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23\": container with ID starting with 2fe47f61c66bd96951eae8db073d9de2f29812063b5cd31cccbaf3634bb99b23 not found: ID does not exist" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.651237 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f" (UID: "2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.664375 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.664347116 podStartE2EDuration="2.664347116s" podCreationTimestamp="2025-10-01 11:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:48:18.655322746 +0000 UTC m=+1189.754887723" watchObservedRunningTime="2025-10-01 11:48:18.664347116 +0000 UTC m=+1189.763912093" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.676941 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-config-data" (OuterVolumeSpecName: "config-data") pod "2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f" (UID: "2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.689842 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.702476 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.712972 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hkxf\" (UniqueName: \"kubernetes.io/projected/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-kube-api-access-6hkxf\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.713025 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.713038 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.814546 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn2pf\" (UniqueName: \"kubernetes.io/projected/9d832718-661a-44fb-bcc8-7f48af908b15-kube-api-access-tn2pf\") pod \"9d832718-661a-44fb-bcc8-7f48af908b15\" (UID: \"9d832718-661a-44fb-bcc8-7f48af908b15\") " Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.821397 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d832718-661a-44fb-bcc8-7f48af908b15-kube-api-access-tn2pf" (OuterVolumeSpecName: "kube-api-access-tn2pf") pod "9d832718-661a-44fb-bcc8-7f48af908b15" (UID: "9d832718-661a-44fb-bcc8-7f48af908b15"). InnerVolumeSpecName "kube-api-access-tn2pf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.916934 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn2pf\" (UniqueName: \"kubernetes.io/projected/9d832718-661a-44fb-bcc8-7f48af908b15-kube-api-access-tn2pf\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.951719 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.964167 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.977908 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:48:18 crc kubenswrapper[4669]: E1001 11:48:18.978426 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f" containerName="nova-scheduler-scheduler" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.978447 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f" containerName="nova-scheduler-scheduler" Oct 01 11:48:18 crc kubenswrapper[4669]: E1001 11:48:18.978469 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d832718-661a-44fb-bcc8-7f48af908b15" containerName="kube-state-metrics" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.978480 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d832718-661a-44fb-bcc8-7f48af908b15" containerName="kube-state-metrics" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.978696 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f" containerName="nova-scheduler-scheduler" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.978715 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d832718-661a-44fb-bcc8-7f48af908b15" 
containerName="kube-state-metrics" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.986355 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.993184 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 11:48:18 crc kubenswrapper[4669]: I1001 11:48:18.993499 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.121022 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.121197 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-config-data\") pod \"nova-scheduler-0\" (UID: \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.121296 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x86vm\" (UniqueName: \"kubernetes.io/projected/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-kube-api-access-x86vm\") pod \"nova-scheduler-0\" (UID: \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.222969 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.223056 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-config-data\") pod \"nova-scheduler-0\" (UID: \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.223146 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x86vm\" (UniqueName: \"kubernetes.io/projected/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-kube-api-access-x86vm\") pod \"nova-scheduler-0\" (UID: \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.228956 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-config-data\") pod \"nova-scheduler-0\" (UID: \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.229640 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.241696 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x86vm\" (UniqueName: \"kubernetes.io/projected/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-kube-api-access-x86vm\") pod \"nova-scheduler-0\" (UID: \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " pod="openstack/nova-scheduler-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.305806 4669 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.652610 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.673300 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a3499b-1be2-457a-adc1-896f0297fc17" path="/var/lib/kubelet/pods/27a3499b-1be2-457a-adc1-896f0297fc17/volumes" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.673944 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f" path="/var/lib/kubelet/pods/2de80b6b-cc4f-4bb0-a331-53b2a6c0a64f/volumes" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.674508 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3be2a0-85e4-4833-89c9-4450ee2e5635" path="/var/lib/kubelet/pods/af3be2a0-85e4-4833-89c9-4450ee2e5635/volumes" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.676269 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9d832718-661a-44fb-bcc8-7f48af908b15","Type":"ContainerDied","Data":"c2199de0a4468222ad06e6c8509deff730da8185a2e350a8c2d0de967188b3f7"} Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.676309 4669 scope.go:117] "RemoveContainer" containerID="d7e3f25f398c9230aa728260c9f128890931fd534923084cf3e18d565b8c6014" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.678111 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecb9caa1-78bc-4cc0-848a-9a6afca67af0","Type":"ContainerStarted","Data":"3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103"} Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.678138 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ecb9caa1-78bc-4cc0-848a-9a6afca67af0","Type":"ContainerStarted","Data":"a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a"} Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.678153 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecb9caa1-78bc-4cc0-848a-9a6afca67af0","Type":"ContainerStarted","Data":"4a7004dde6e1616f9908cc5bfe3cfcbac6cfa2df851a2f8c92ebdf856c06370b"} Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.711689 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.726203 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.736199 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.737890 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.740258 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.740882 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.750268 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.758069 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.758035703 podStartE2EDuration="2.758035703s" podCreationTimestamp="2025-10-01 11:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:48:19.721021011 +0000 UTC m=+1190.820585988" watchObservedRunningTime="2025-10-01 11:48:19.758035703 +0000 UTC m=+1190.857600680" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.796798 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.837034 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4ec071d-763f-4513-8e0b-30fd6c1980d0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.837479 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ec071d-763f-4513-8e0b-30fd6c1980d0-combined-ca-bundle\") pod \"kube-state-metrics-0\" 
(UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.837525 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d4ec071d-763f-4513-8e0b-30fd6c1980d0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.837597 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zghq7\" (UniqueName: \"kubernetes.io/projected/d4ec071d-763f-4513-8e0b-30fd6c1980d0-kube-api-access-zghq7\") pod \"kube-state-metrics-0\" (UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.940285 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d4ec071d-763f-4513-8e0b-30fd6c1980d0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.940392 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zghq7\" (UniqueName: \"kubernetes.io/projected/d4ec071d-763f-4513-8e0b-30fd6c1980d0-kube-api-access-zghq7\") pod \"kube-state-metrics-0\" (UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.940509 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4ec071d-763f-4513-8e0b-30fd6c1980d0-kube-state-metrics-tls-certs\") pod 
\"kube-state-metrics-0\" (UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.940563 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ec071d-763f-4513-8e0b-30fd6c1980d0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.946910 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ec071d-763f-4513-8e0b-30fd6c1980d0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.948133 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.949411 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4ec071d-763f-4513-8e0b-30fd6c1980d0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.949447 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 11:48:19 crc kubenswrapper[4669]: I1001 11:48:19.949990 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d4ec071d-763f-4513-8e0b-30fd6c1980d0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:19 crc 
kubenswrapper[4669]: I1001 11:48:19.962805 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zghq7\" (UniqueName: \"kubernetes.io/projected/d4ec071d-763f-4513-8e0b-30fd6c1980d0-kube-api-access-zghq7\") pod \"kube-state-metrics-0\" (UID: \"d4ec071d-763f-4513-8e0b-30fd6c1980d0\") " pod="openstack/kube-state-metrics-0" Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.055042 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.305563 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.306422 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="sg-core" containerID="cri-o://ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30" gracePeriod=30 Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.306474 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="ceilometer-notification-agent" containerID="cri-o://f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7" gracePeriod=30 Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.306448 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="proxy-httpd" containerID="cri-o://c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4" gracePeriod=30 Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.306783 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="ceilometer-central-agent" 
containerID="cri-o://3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6" gracePeriod=30 Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.532073 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 11:48:20 crc kubenswrapper[4669]: W1001 11:48:20.534348 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4ec071d_763f_4513_8e0b_30fd6c1980d0.slice/crio-412820fd22f76d68a087dd36a272b68d286a54d017554cd535f6bda9ba1e9e8e WatchSource:0}: Error finding container 412820fd22f76d68a087dd36a272b68d286a54d017554cd535f6bda9ba1e9e8e: Status 404 returned error can't find the container with id 412820fd22f76d68a087dd36a272b68d286a54d017554cd535f6bda9ba1e9e8e Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.701762 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9e402de8-e8f7-4c99-8a1a-58e95ef031f2","Type":"ContainerStarted","Data":"de88488c5f98ad42a25a9fc5a6eed7582c8c86b9b7e8b0d39fcb0e78261995d3"} Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.701818 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9e402de8-e8f7-4c99-8a1a-58e95ef031f2","Type":"ContainerStarted","Data":"fbbeb3a8b9880d697ff39e17b00fb77a9b6312f91b5720d0807434a8e9bf669e"} Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.706997 4669 generic.go:334] "Generic (PLEG): container finished" podID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerID="c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4" exitCode=0 Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.707054 4669 generic.go:334] "Generic (PLEG): container finished" podID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerID="ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30" exitCode=2 Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.707087 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d559a1-8d44-41cf-a42f-51ab2b87d60f","Type":"ContainerDied","Data":"c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4"} Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.707124 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d559a1-8d44-41cf-a42f-51ab2b87d60f","Type":"ContainerDied","Data":"ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30"} Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.708873 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d4ec071d-763f-4513-8e0b-30fd6c1980d0","Type":"ContainerStarted","Data":"412820fd22f76d68a087dd36a272b68d286a54d017554cd535f6bda9ba1e9e8e"} Oct 01 11:48:20 crc kubenswrapper[4669]: I1001 11:48:20.728326 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.728299662 podStartE2EDuration="2.728299662s" podCreationTimestamp="2025-10-01 11:48:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:48:20.720574534 +0000 UTC m=+1191.820139511" watchObservedRunningTime="2025-10-01 11:48:20.728299662 +0000 UTC m=+1191.827864639" Oct 01 11:48:21 crc kubenswrapper[4669]: I1001 11:48:21.658486 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d832718-661a-44fb-bcc8-7f48af908b15" path="/var/lib/kubelet/pods/9d832718-661a-44fb-bcc8-7f48af908b15/volumes" Oct 01 11:48:21 crc kubenswrapper[4669]: I1001 11:48:21.731941 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d4ec071d-763f-4513-8e0b-30fd6c1980d0","Type":"ContainerStarted","Data":"c1962004026ed3ce9bc5cea9c57e7d48a2a83e95ca4f287bfc1ed22849e83ff0"} Oct 01 11:48:21 crc kubenswrapper[4669]: I1001 
11:48:21.732057 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 11:48:21 crc kubenswrapper[4669]: I1001 11:48:21.738144 4669 generic.go:334] "Generic (PLEG): container finished" podID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerID="3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6" exitCode=0 Oct 01 11:48:21 crc kubenswrapper[4669]: I1001 11:48:21.738194 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d559a1-8d44-41cf-a42f-51ab2b87d60f","Type":"ContainerDied","Data":"3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6"} Oct 01 11:48:21 crc kubenswrapper[4669]: I1001 11:48:21.759622 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.285688242 podStartE2EDuration="2.759599248s" podCreationTimestamp="2025-10-01 11:48:19 +0000 UTC" firstStartedPulling="2025-10-01 11:48:20.537329616 +0000 UTC m=+1191.636894593" lastFinishedPulling="2025-10-01 11:48:21.011240612 +0000 UTC m=+1192.110805599" observedRunningTime="2025-10-01 11:48:21.750531018 +0000 UTC m=+1192.850095995" watchObservedRunningTime="2025-10-01 11:48:21.759599248 +0000 UTC m=+1192.859164225" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.037416 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.482826 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.624195 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-sg-core-conf-yaml\") pod \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.624253 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-scripts\") pod \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.624328 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-log-httpd\") pod \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.624373 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-combined-ca-bundle\") pod \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.624445 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qffmr\" (UniqueName: \"kubernetes.io/projected/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-kube-api-access-qffmr\") pod \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.624526 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-config-data\") pod \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.624739 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-run-httpd\") pod \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\" (UID: \"e8d559a1-8d44-41cf-a42f-51ab2b87d60f\") " Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.625186 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e8d559a1-8d44-41cf-a42f-51ab2b87d60f" (UID: "e8d559a1-8d44-41cf-a42f-51ab2b87d60f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.625364 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.626629 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e8d559a1-8d44-41cf-a42f-51ab2b87d60f" (UID: "e8d559a1-8d44-41cf-a42f-51ab2b87d60f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.633422 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-kube-api-access-qffmr" (OuterVolumeSpecName: "kube-api-access-qffmr") pod "e8d559a1-8d44-41cf-a42f-51ab2b87d60f" (UID: "e8d559a1-8d44-41cf-a42f-51ab2b87d60f"). 
InnerVolumeSpecName "kube-api-access-qffmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.633934 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-scripts" (OuterVolumeSpecName: "scripts") pod "e8d559a1-8d44-41cf-a42f-51ab2b87d60f" (UID: "e8d559a1-8d44-41cf-a42f-51ab2b87d60f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.658047 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e8d559a1-8d44-41cf-a42f-51ab2b87d60f" (UID: "e8d559a1-8d44-41cf-a42f-51ab2b87d60f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.710545 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8d559a1-8d44-41cf-a42f-51ab2b87d60f" (UID: "e8d559a1-8d44-41cf-a42f-51ab2b87d60f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.727469 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.727496 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.727517 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qffmr\" (UniqueName: \"kubernetes.io/projected/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-kube-api-access-qffmr\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.727527 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.727537 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.732824 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-config-data" (OuterVolumeSpecName: "config-data") pod "e8d559a1-8d44-41cf-a42f-51ab2b87d60f" (UID: "e8d559a1-8d44-41cf-a42f-51ab2b87d60f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.754628 4669 generic.go:334] "Generic (PLEG): container finished" podID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerID="f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7" exitCode=0 Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.754693 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.754782 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d559a1-8d44-41cf-a42f-51ab2b87d60f","Type":"ContainerDied","Data":"f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7"} Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.754820 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d559a1-8d44-41cf-a42f-51ab2b87d60f","Type":"ContainerDied","Data":"caa33e2b2841761364de7bdc39c0f693e2120fbbbb3f7e04fcc94836510737c6"} Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.754841 4669 scope.go:117] "RemoveContainer" containerID="c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.789524 4669 scope.go:117] "RemoveContainer" containerID="ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.793583 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.801168 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.819479 4669 scope.go:117] "RemoveContainer" containerID="f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.830436 4669 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d559a1-8d44-41cf-a42f-51ab2b87d60f-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.840438 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:48:22 crc kubenswrapper[4669]: E1001 11:48:22.847316 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="proxy-httpd" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.847374 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="proxy-httpd" Oct 01 11:48:22 crc kubenswrapper[4669]: E1001 11:48:22.847433 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="ceilometer-central-agent" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.847442 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="ceilometer-central-agent" Oct 01 11:48:22 crc kubenswrapper[4669]: E1001 11:48:22.847466 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="sg-core" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.847474 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="sg-core" Oct 01 11:48:22 crc kubenswrapper[4669]: E1001 11:48:22.847511 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="ceilometer-notification-agent" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.847519 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="ceilometer-notification-agent" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 
11:48:22.847956 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="ceilometer-central-agent" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.847987 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="ceilometer-notification-agent" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.848002 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="sg-core" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.848009 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" containerName="proxy-httpd" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.850070 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.858521 4669 scope.go:117] "RemoveContainer" containerID="3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.859519 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.862982 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.865715 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.867854 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.895121 4669 scope.go:117] "RemoveContainer" containerID="c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4" Oct 01 11:48:22 crc kubenswrapper[4669]: E1001 
11:48:22.895969 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4\": container with ID starting with c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4 not found: ID does not exist" containerID="c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.896025 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4"} err="failed to get container status \"c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4\": rpc error: code = NotFound desc = could not find container \"c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4\": container with ID starting with c170e654345b118a4639a860c83af53948edca73ca03961232efd078a491bba4 not found: ID does not exist" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.896054 4669 scope.go:117] "RemoveContainer" containerID="ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30" Oct 01 11:48:22 crc kubenswrapper[4669]: E1001 11:48:22.897152 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30\": container with ID starting with ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30 not found: ID does not exist" containerID="ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.897177 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30"} err="failed to get container status \"ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30\": rpc 
error: code = NotFound desc = could not find container \"ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30\": container with ID starting with ffb82588f0e2b92360fd7689ebb8400283cb1ec008a20007ceb96ce0d2e30c30 not found: ID does not exist" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.897193 4669 scope.go:117] "RemoveContainer" containerID="f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7" Oct 01 11:48:22 crc kubenswrapper[4669]: E1001 11:48:22.899174 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7\": container with ID starting with f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7 not found: ID does not exist" containerID="f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.899195 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7"} err="failed to get container status \"f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7\": rpc error: code = NotFound desc = could not find container \"f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7\": container with ID starting with f7ed5e1eb98e13e597a905e3c2fffc15971fc766614326677f6eff22587996d7 not found: ID does not exist" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.899208 4669 scope.go:117] "RemoveContainer" containerID="3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6" Oct 01 11:48:22 crc kubenswrapper[4669]: E1001 11:48:22.900065 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6\": container with ID starting with 
3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6 not found: ID does not exist" containerID="3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.900139 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6"} err="failed to get container status \"3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6\": rpc error: code = NotFound desc = could not find container \"3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6\": container with ID starting with 3ab275ec9d88c7f1262c33919a2da6a73175a0a5959e4ae123acc024d7f16fe6 not found: ID does not exist" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.931608 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-run-httpd\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.931699 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.931735 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt4zz\" (UniqueName: \"kubernetes.io/projected/d8ae56ce-101e-4066-848d-3f979af046be-kube-api-access-pt4zz\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0" Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.931758 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.931773 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-log-httpd\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.931792 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-scripts\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.931832 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-config-data\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:22 crc kubenswrapper[4669]: I1001 11:48:22.931854 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.035234 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-run-httpd\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.035739 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.035930 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-run-httpd\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.036213 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt4zz\" (UniqueName: \"kubernetes.io/projected/d8ae56ce-101e-4066-848d-3f979af046be-kube-api-access-pt4zz\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.036456 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.036729 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-log-httpd\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.037841 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-scripts\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.038243 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-config-data\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.038490 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.037380 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-log-httpd\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.043872 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.047141 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.054117 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-config-data\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.054358 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-scripts\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.054597 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.067825 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt4zz\" (UniqueName: \"kubernetes.io/projected/d8ae56ce-101e-4066-848d-3f979af046be-kube-api-access-pt4zz\") pod \"ceilometer-0\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.171781 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.668959 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8d559a1-8d44-41cf-a42f-51ab2b87d60f" path="/var/lib/kubelet/pods/e8d559a1-8d44-41cf-a42f-51ab2b87d60f/volumes"
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.727941 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 11:48:23 crc kubenswrapper[4669]: W1001 11:48:23.741112 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8ae56ce_101e_4066_848d_3f979af046be.slice/crio-f7f56030fe8738100dacc1b0d86bce5996ff94c9e7b495a0bca26eb643558684 WatchSource:0}: Error finding container f7f56030fe8738100dacc1b0d86bce5996ff94c9e7b495a0bca26eb643558684: Status 404 returned error can't find the container with id f7f56030fe8738100dacc1b0d86bce5996ff94c9e7b495a0bca26eb643558684
Oct 01 11:48:23 crc kubenswrapper[4669]: I1001 11:48:23.775098 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8ae56ce-101e-4066-848d-3f979af046be","Type":"ContainerStarted","Data":"f7f56030fe8738100dacc1b0d86bce5996ff94c9e7b495a0bca26eb643558684"}
Oct 01 11:48:24 crc kubenswrapper[4669]: I1001 11:48:24.306623 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 01 11:48:24 crc kubenswrapper[4669]: I1001 11:48:24.790912 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8ae56ce-101e-4066-848d-3f979af046be","Type":"ContainerStarted","Data":"b021b7c117ef49bef8f57eb7d62713a50d0d0a9f4176817a5cace07da6f711e6"}
Oct 01 11:48:24 crc kubenswrapper[4669]: I1001 11:48:24.949157 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 01 11:48:24 crc kubenswrapper[4669]: I1001 11:48:24.949228 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 01 11:48:25 crc kubenswrapper[4669]: I1001 11:48:25.804746 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8ae56ce-101e-4066-848d-3f979af046be","Type":"ContainerStarted","Data":"e83d985540ab9099a134863c33ed415ca9aa6cb8a08fcf0b5c3e8085f7d9f7cd"}
Oct 01 11:48:25 crc kubenswrapper[4669]: I1001 11:48:25.973498 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 01 11:48:25 crc kubenswrapper[4669]: I1001 11:48:25.973541 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 01 11:48:26 crc kubenswrapper[4669]: I1001 11:48:26.820406 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8ae56ce-101e-4066-848d-3f979af046be","Type":"ContainerStarted","Data":"9f2064a85363ebee355425c2e8977d25f59e0ed4f25ade8bd2134a6081ad6518"}
Oct 01 11:48:28 crc kubenswrapper[4669]: I1001 11:48:28.033408 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 01 11:48:28 crc kubenswrapper[4669]: I1001 11:48:28.034036 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 01 11:48:28 crc kubenswrapper[4669]: I1001 11:48:28.856373 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8ae56ce-101e-4066-848d-3f979af046be","Type":"ContainerStarted","Data":"bde7ff022061983e6a3e6ee798b92f6e3dc94be33ab7e033f24171d913c9d139"}
Oct 01 11:48:28 crc kubenswrapper[4669]: I1001 11:48:28.902904 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.887693212 podStartE2EDuration="6.902875015s" podCreationTimestamp="2025-10-01 11:48:22 +0000 UTC" firstStartedPulling="2025-10-01 11:48:23.747120761 +0000 UTC m=+1194.846685748" lastFinishedPulling="2025-10-01 11:48:27.762302564 +0000 UTC m=+1198.861867551" observedRunningTime="2025-10-01 11:48:28.902321281 +0000 UTC m=+1200.001886278" watchObservedRunningTime="2025-10-01 11:48:28.902875015 +0000 UTC m=+1200.002440012"
Oct 01 11:48:29 crc kubenswrapper[4669]: I1001 11:48:29.116327 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 01 11:48:29 crc kubenswrapper[4669]: I1001 11:48:29.116704 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 01 11:48:29 crc kubenswrapper[4669]: I1001 11:48:29.306293 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 01 11:48:29 crc kubenswrapper[4669]: I1001 11:48:29.347235 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 01 11:48:29 crc kubenswrapper[4669]: I1001 11:48:29.868178 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 01 11:48:29 crc kubenswrapper[4669]: I1001 11:48:29.911769 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 01 11:48:30 crc kubenswrapper[4669]: I1001 11:48:30.084404 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Oct 01 11:48:31 crc kubenswrapper[4669]: I1001 11:48:31.863829 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 11:48:31 crc kubenswrapper[4669]: I1001 11:48:31.864935 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 11:48:34 crc kubenswrapper[4669]: I1001 11:48:34.954910 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 01 11:48:34 crc kubenswrapper[4669]: I1001 11:48:34.961868 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 01 11:48:34 crc kubenswrapper[4669]: I1001 11:48:34.965302 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 01 11:48:35 crc kubenswrapper[4669]: I1001 11:48:35.943697 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.692231 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.792738 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4954\" (UniqueName: \"kubernetes.io/projected/86b8cae7-2784-4192-8b18-3cd38d6123f3-kube-api-access-c4954\") pod \"86b8cae7-2784-4192-8b18-3cd38d6123f3\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") "
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.792913 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-combined-ca-bundle\") pod \"86b8cae7-2784-4192-8b18-3cd38d6123f3\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") "
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.792996 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-config-data\") pod \"86b8cae7-2784-4192-8b18-3cd38d6123f3\" (UID: \"86b8cae7-2784-4192-8b18-3cd38d6123f3\") "
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.815146 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b8cae7-2784-4192-8b18-3cd38d6123f3-kube-api-access-c4954" (OuterVolumeSpecName: "kube-api-access-c4954") pod "86b8cae7-2784-4192-8b18-3cd38d6123f3" (UID: "86b8cae7-2784-4192-8b18-3cd38d6123f3"). InnerVolumeSpecName "kube-api-access-c4954". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.827628 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-config-data" (OuterVolumeSpecName: "config-data") pod "86b8cae7-2784-4192-8b18-3cd38d6123f3" (UID: "86b8cae7-2784-4192-8b18-3cd38d6123f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.829754 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86b8cae7-2784-4192-8b18-3cd38d6123f3" (UID: "86b8cae7-2784-4192-8b18-3cd38d6123f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.896516 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4954\" (UniqueName: \"kubernetes.io/projected/86b8cae7-2784-4192-8b18-3cd38d6123f3-kube-api-access-c4954\") on node \"crc\" DevicePath \"\""
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.896557 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.896568 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86b8cae7-2784-4192-8b18-3cd38d6123f3-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.948657 4669 generic.go:334] "Generic (PLEG): container finished" podID="86b8cae7-2784-4192-8b18-3cd38d6123f3" containerID="1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87" exitCode=137
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.948705 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.948805 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"86b8cae7-2784-4192-8b18-3cd38d6123f3","Type":"ContainerDied","Data":"1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87"}
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.948853 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"86b8cae7-2784-4192-8b18-3cd38d6123f3","Type":"ContainerDied","Data":"6099c7507ac33384031bdabff7c46594c39a6b6653abbf652108af4b70bee882"}
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.948883 4669 scope.go:117] "RemoveContainer" containerID="1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87"
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.975224 4669 scope.go:117] "RemoveContainer" containerID="1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87"
Oct 01 11:48:36 crc kubenswrapper[4669]: E1001 11:48:36.975849 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87\": container with ID starting with 1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87 not found: ID does not exist" containerID="1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87"
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.975919 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87"} err="failed to get container status \"1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87\": rpc error: code = NotFound desc = could not find container \"1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87\": container with ID starting with 1aa2fdab9c69a27ef0dc80599d8998689f145bdbacab0adf4b72683632135b87 not found: ID does not exist"
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.991301 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 01 11:48:36 crc kubenswrapper[4669]: I1001 11:48:36.999282 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.021155 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 01 11:48:37 crc kubenswrapper[4669]: E1001 11:48:37.021754 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b8cae7-2784-4192-8b18-3cd38d6123f3" containerName="nova-cell1-novncproxy-novncproxy"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.021780 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b8cae7-2784-4192-8b18-3cd38d6123f3" containerName="nova-cell1-novncproxy-novncproxy"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.022092 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b8cae7-2784-4192-8b18-3cd38d6123f3" containerName="nova-cell1-novncproxy-novncproxy"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.022968 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.030314 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.030742 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.030910 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.048568 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.100164 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.100488 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.100667 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.101029 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrr7q\" (UniqueName: \"kubernetes.io/projected/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-kube-api-access-mrr7q\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.101205 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.203386 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.203695 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrr7q\" (UniqueName: \"kubernetes.io/projected/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-kube-api-access-mrr7q\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.203730 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.203856 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.203884 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.210223 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.210305 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.211541 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.213158 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.230918 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrr7q\" (UniqueName: \"kubernetes.io/projected/ef9631f5-92a1-4d2b-a5a6-25b60a609d61-kube-api-access-mrr7q\") pod \"nova-cell1-novncproxy-0\" (UID: \"ef9631f5-92a1-4d2b-a5a6-25b60a609d61\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.357385 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.658930 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b8cae7-2784-4192-8b18-3cd38d6123f3" path="/var/lib/kubelet/pods/86b8cae7-2784-4192-8b18-3cd38d6123f3/volumes"
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.893544 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 01 11:48:37 crc kubenswrapper[4669]: W1001 11:48:37.896721 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef9631f5_92a1_4d2b_a5a6_25b60a609d61.slice/crio-8d9886fff6d1f39cf2e1c4d5d2bdbe4c1b807e44df2e85de80a1ef376328954b WatchSource:0}: Error finding container 8d9886fff6d1f39cf2e1c4d5d2bdbe4c1b807e44df2e85de80a1ef376328954b: Status 404 returned error can't find the container with id 8d9886fff6d1f39cf2e1c4d5d2bdbe4c1b807e44df2e85de80a1ef376328954b
Oct 01 11:48:37 crc kubenswrapper[4669]: I1001 11:48:37.968667 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ef9631f5-92a1-4d2b-a5a6-25b60a609d61","Type":"ContainerStarted","Data":"8d9886fff6d1f39cf2e1c4d5d2bdbe4c1b807e44df2e85de80a1ef376328954b"}
Oct 01 11:48:38 crc kubenswrapper[4669]: I1001 11:48:38.038975 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 01 11:48:38 crc kubenswrapper[4669]: I1001 11:48:38.039633 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 01 11:48:38 crc kubenswrapper[4669]: I1001 11:48:38.040023 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 01 11:48:38 crc kubenswrapper[4669]: I1001 11:48:38.044845 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 01 11:48:38 crc kubenswrapper[4669]: I1001 11:48:38.982508 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ef9631f5-92a1-4d2b-a5a6-25b60a609d61","Type":"ContainerStarted","Data":"7a798f7b5720f0507e1069c0c0c7f5077275fabe7ef1be2707f4d6f1ec744a9b"}
Oct 01 11:48:38 crc kubenswrapper[4669]: I1001 11:48:38.982872 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 01 11:48:38 crc kubenswrapper[4669]: I1001 11:48:38.991942 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.023551 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.023518509 podStartE2EDuration="3.023518509s" podCreationTimestamp="2025-10-01 11:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:48:39.013236889 +0000 UTC m=+1210.112801886" watchObservedRunningTime="2025-10-01 11:48:39.023518509 +0000 UTC m=+1210.123083496"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.223778 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gltl2"]
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.225961 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.254340 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gltl2"]
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.354126 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.354263 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-config\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.354318 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.354356 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6gcm\" (UniqueName: \"kubernetes.io/projected/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-kube-api-access-z6gcm\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.354411 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.354456 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.456653 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.457003 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.457194 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.457362 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-config\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.457472 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.457579 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6gcm\" (UniqueName: \"kubernetes.io/projected/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-kube-api-access-z6gcm\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.458922 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.459310 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.459379 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.459632 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.459833 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-config\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.500364 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6gcm\" (UniqueName: \"kubernetes.io/projected/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-kube-api-access-z6gcm\") pod \"dnsmasq-dns-59cf4bdb65-gltl2\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2"
Oct 01 11:48:39 crc kubenswrapper[4669]: I1001 11:48:39.571127 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" Oct 01 11:48:40 crc kubenswrapper[4669]: I1001 11:48:40.076202 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gltl2"] Oct 01 11:48:41 crc kubenswrapper[4669]: I1001 11:48:41.011133 4669 generic.go:334] "Generic (PLEG): container finished" podID="ddc3cccf-6f89-44ac-a85c-8ab53a95493b" containerID="5e307accf29ddb5bb2a4f31d428ee861817ee0c31d692523fc82eff7923bf296" exitCode=0 Oct 01 11:48:41 crc kubenswrapper[4669]: I1001 11:48:41.011200 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" event={"ID":"ddc3cccf-6f89-44ac-a85c-8ab53a95493b","Type":"ContainerDied","Data":"5e307accf29ddb5bb2a4f31d428ee861817ee0c31d692523fc82eff7923bf296"} Oct 01 11:48:41 crc kubenswrapper[4669]: I1001 11:48:41.011314 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" event={"ID":"ddc3cccf-6f89-44ac-a85c-8ab53a95493b","Type":"ContainerStarted","Data":"1c513d5fc1a6693d07435aa9e85e5082e47dc9c732069cb5f6a8048456306fb3"} Oct 01 11:48:41 crc kubenswrapper[4669]: I1001 11:48:41.383499 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:48:41 crc kubenswrapper[4669]: I1001 11:48:41.384106 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="ceilometer-central-agent" containerID="cri-o://b021b7c117ef49bef8f57eb7d62713a50d0d0a9f4176817a5cace07da6f711e6" gracePeriod=30 Oct 01 11:48:41 crc kubenswrapper[4669]: I1001 11:48:41.385173 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="ceilometer-notification-agent" containerID="cri-o://e83d985540ab9099a134863c33ed415ca9aa6cb8a08fcf0b5c3e8085f7d9f7cd" gracePeriod=30 Oct 01 11:48:41 
crc kubenswrapper[4669]: I1001 11:48:41.385314 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="proxy-httpd" containerID="cri-o://bde7ff022061983e6a3e6ee798b92f6e3dc94be33ab7e033f24171d913c9d139" gracePeriod=30 Oct 01 11:48:41 crc kubenswrapper[4669]: I1001 11:48:41.385402 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="sg-core" containerID="cri-o://9f2064a85363ebee355425c2e8977d25f59e0ed4f25ade8bd2134a6081ad6518" gracePeriod=30 Oct 01 11:48:41 crc kubenswrapper[4669]: I1001 11:48:41.415335 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.198:3000/\": EOF" Oct 01 11:48:41 crc kubenswrapper[4669]: I1001 11:48:41.901379 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:42 crc kubenswrapper[4669]: I1001 11:48:42.023956 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" event={"ID":"ddc3cccf-6f89-44ac-a85c-8ab53a95493b","Type":"ContainerStarted","Data":"cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc"} Oct 01 11:48:42 crc kubenswrapper[4669]: I1001 11:48:42.024068 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" Oct 01 11:48:42 crc kubenswrapper[4669]: I1001 11:48:42.027547 4669 generic.go:334] "Generic (PLEG): container finished" podID="d8ae56ce-101e-4066-848d-3f979af046be" containerID="bde7ff022061983e6a3e6ee798b92f6e3dc94be33ab7e033f24171d913c9d139" exitCode=0 Oct 01 11:48:42 crc kubenswrapper[4669]: I1001 11:48:42.027578 4669 generic.go:334] "Generic (PLEG): container finished" 
podID="d8ae56ce-101e-4066-848d-3f979af046be" containerID="9f2064a85363ebee355425c2e8977d25f59e0ed4f25ade8bd2134a6081ad6518" exitCode=2 Oct 01 11:48:42 crc kubenswrapper[4669]: I1001 11:48:42.027746 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerName="nova-api-log" containerID="cri-o://a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a" gracePeriod=30 Oct 01 11:48:42 crc kubenswrapper[4669]: I1001 11:48:42.027900 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerName="nova-api-api" containerID="cri-o://3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103" gracePeriod=30 Oct 01 11:48:42 crc kubenswrapper[4669]: I1001 11:48:42.027907 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8ae56ce-101e-4066-848d-3f979af046be","Type":"ContainerDied","Data":"bde7ff022061983e6a3e6ee798b92f6e3dc94be33ab7e033f24171d913c9d139"} Oct 01 11:48:42 crc kubenswrapper[4669]: I1001 11:48:42.028036 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8ae56ce-101e-4066-848d-3f979af046be","Type":"ContainerDied","Data":"9f2064a85363ebee355425c2e8977d25f59e0ed4f25ade8bd2134a6081ad6518"} Oct 01 11:48:42 crc kubenswrapper[4669]: I1001 11:48:42.056054 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" podStartSLOduration=3.055901329 podStartE2EDuration="3.055901329s" podCreationTimestamp="2025-10-01 11:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:48:42.055269163 +0000 UTC m=+1213.154834140" watchObservedRunningTime="2025-10-01 11:48:42.055901329 +0000 UTC m=+1213.155466296" Oct 01 11:48:42 
crc kubenswrapper[4669]: I1001 11:48:42.358542 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:43 crc kubenswrapper[4669]: I1001 11:48:43.039533 4669 generic.go:334] "Generic (PLEG): container finished" podID="d8ae56ce-101e-4066-848d-3f979af046be" containerID="b021b7c117ef49bef8f57eb7d62713a50d0d0a9f4176817a5cace07da6f711e6" exitCode=0 Oct 01 11:48:43 crc kubenswrapper[4669]: I1001 11:48:43.039720 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8ae56ce-101e-4066-848d-3f979af046be","Type":"ContainerDied","Data":"b021b7c117ef49bef8f57eb7d62713a50d0d0a9f4176817a5cace07da6f711e6"} Oct 01 11:48:43 crc kubenswrapper[4669]: I1001 11:48:43.042667 4669 generic.go:334] "Generic (PLEG): container finished" podID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerID="a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a" exitCode=143 Oct 01 11:48:43 crc kubenswrapper[4669]: I1001 11:48:43.042735 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecb9caa1-78bc-4cc0-848a-9a6afca67af0","Type":"ContainerDied","Data":"a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a"} Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.715591 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.856898 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-config-data\") pod \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.857005 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrhn8\" (UniqueName: \"kubernetes.io/projected/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-kube-api-access-nrhn8\") pod \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.857182 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-logs\") pod \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.857374 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-combined-ca-bundle\") pod \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\" (UID: \"ecb9caa1-78bc-4cc0-848a-9a6afca67af0\") " Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.857662 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-logs" (OuterVolumeSpecName: "logs") pod "ecb9caa1-78bc-4cc0-848a-9a6afca67af0" (UID: "ecb9caa1-78bc-4cc0-848a-9a6afca67af0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.872563 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-kube-api-access-nrhn8" (OuterVolumeSpecName: "kube-api-access-nrhn8") pod "ecb9caa1-78bc-4cc0-848a-9a6afca67af0" (UID: "ecb9caa1-78bc-4cc0-848a-9a6afca67af0"). InnerVolumeSpecName "kube-api-access-nrhn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.938507 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-config-data" (OuterVolumeSpecName: "config-data") pod "ecb9caa1-78bc-4cc0-848a-9a6afca67af0" (UID: "ecb9caa1-78bc-4cc0-848a-9a6afca67af0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.959615 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.959659 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.959673 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrhn8\" (UniqueName: \"kubernetes.io/projected/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-kube-api-access-nrhn8\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:45 crc kubenswrapper[4669]: I1001 11:48:45.986491 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") 
pod "ecb9caa1-78bc-4cc0-848a-9a6afca67af0" (UID: "ecb9caa1-78bc-4cc0-848a-9a6afca67af0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.062890 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecb9caa1-78bc-4cc0-848a-9a6afca67af0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.077264 4669 generic.go:334] "Generic (PLEG): container finished" podID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerID="3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103" exitCode=0 Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.077469 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecb9caa1-78bc-4cc0-848a-9a6afca67af0","Type":"ContainerDied","Data":"3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103"} Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.077617 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecb9caa1-78bc-4cc0-848a-9a6afca67af0","Type":"ContainerDied","Data":"4a7004dde6e1616f9908cc5bfe3cfcbac6cfa2df851a2f8c92ebdf856c06370b"} Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.077714 4669 scope.go:117] "RemoveContainer" containerID="3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.077929 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.119337 4669 scope.go:117] "RemoveContainer" containerID="a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.122687 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.133274 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.171337 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:46 crc kubenswrapper[4669]: E1001 11:48:46.171962 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerName="nova-api-api" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.171986 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerName="nova-api-api" Oct 01 11:48:46 crc kubenswrapper[4669]: E1001 11:48:46.172006 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerName="nova-api-log" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.172014 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerName="nova-api-log" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.172325 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerName="nova-api-log" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.172351 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" containerName="nova-api-api" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.172381 4669 scope.go:117] "RemoveContainer" 
containerID="3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.174073 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: E1001 11:48:46.176589 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103\": container with ID starting with 3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103 not found: ID does not exist" containerID="3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.176640 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103"} err="failed to get container status \"3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103\": rpc error: code = NotFound desc = could not find container \"3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103\": container with ID starting with 3ba8618babd303a1bdc2aa423e36e99c0fa364ec6a15d095ee6caf56bd9a6103 not found: ID does not exist" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.176686 4669 scope.go:117] "RemoveContainer" containerID="a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a" Oct 01 11:48:46 crc kubenswrapper[4669]: E1001 11:48:46.178922 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a\": container with ID starting with a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a not found: ID does not exist" containerID="a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 
11:48:46.178999 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a"} err="failed to get container status \"a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a\": rpc error: code = NotFound desc = could not find container \"a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a\": container with ID starting with a7d6b0086ea41becf95bb86c0f14ca13d2453a1291244c2d135241f457556e9a not found: ID does not exist" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.180597 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.180955 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.183677 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.206069 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.266709 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.266778 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8kxb\" (UniqueName: \"kubernetes.io/projected/6a472690-2229-4a59-9ba7-8bd28109cb5c-kube-api-access-f8kxb\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 
11:48:46.266809 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.267006 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a472690-2229-4a59-9ba7-8bd28109cb5c-logs\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.267305 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-config-data\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.267512 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.370272 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.370429 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.370468 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8kxb\" (UniqueName: \"kubernetes.io/projected/6a472690-2229-4a59-9ba7-8bd28109cb5c-kube-api-access-f8kxb\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.370509 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.370561 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a472690-2229-4a59-9ba7-8bd28109cb5c-logs\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.370647 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-config-data\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.371822 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a472690-2229-4a59-9ba7-8bd28109cb5c-logs\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.378207 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.378495 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.378660 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.379138 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-config-data\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.408211 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8kxb\" (UniqueName: \"kubernetes.io/projected/6a472690-2229-4a59-9ba7-8bd28109cb5c-kube-api-access-f8kxb\") pod \"nova-api-0\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " pod="openstack/nova-api-0" Oct 01 11:48:46 crc kubenswrapper[4669]: I1001 11:48:46.516238 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.010588 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:47 crc kubenswrapper[4669]: W1001 11:48:47.032847 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a472690_2229_4a59_9ba7_8bd28109cb5c.slice/crio-3a5ffedccec3a50a97be00a96ee9f2efa8749a7adea54a126b559d554e1aa869 WatchSource:0}: Error finding container 3a5ffedccec3a50a97be00a96ee9f2efa8749a7adea54a126b559d554e1aa869: Status 404 returned error can't find the container with id 3a5ffedccec3a50a97be00a96ee9f2efa8749a7adea54a126b559d554e1aa869 Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.121273 4669 generic.go:334] "Generic (PLEG): container finished" podID="d8ae56ce-101e-4066-848d-3f979af046be" containerID="e83d985540ab9099a134863c33ed415ca9aa6cb8a08fcf0b5c3e8085f7d9f7cd" exitCode=0 Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.121372 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8ae56ce-101e-4066-848d-3f979af046be","Type":"ContainerDied","Data":"e83d985540ab9099a134863c33ed415ca9aa6cb8a08fcf0b5c3e8085f7d9f7cd"} Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.122984 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a472690-2229-4a59-9ba7-8bd28109cb5c","Type":"ContainerStarted","Data":"3a5ffedccec3a50a97be00a96ee9f2efa8749a7adea54a126b559d554e1aa869"} Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.157825 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.290070 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-run-httpd\") pod \"d8ae56ce-101e-4066-848d-3f979af046be\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.290163 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-ceilometer-tls-certs\") pod \"d8ae56ce-101e-4066-848d-3f979af046be\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.290192 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-config-data\") pod \"d8ae56ce-101e-4066-848d-3f979af046be\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.290255 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-log-httpd\") pod \"d8ae56ce-101e-4066-848d-3f979af046be\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.290450 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-scripts\") pod \"d8ae56ce-101e-4066-848d-3f979af046be\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.290509 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt4zz\" (UniqueName: 
\"kubernetes.io/projected/d8ae56ce-101e-4066-848d-3f979af046be-kube-api-access-pt4zz\") pod \"d8ae56ce-101e-4066-848d-3f979af046be\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.290565 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-sg-core-conf-yaml\") pod \"d8ae56ce-101e-4066-848d-3f979af046be\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.290595 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-combined-ca-bundle\") pod \"d8ae56ce-101e-4066-848d-3f979af046be\" (UID: \"d8ae56ce-101e-4066-848d-3f979af046be\") " Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.290864 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d8ae56ce-101e-4066-848d-3f979af046be" (UID: "d8ae56ce-101e-4066-848d-3f979af046be"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.291678 4669 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.292920 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d8ae56ce-101e-4066-848d-3f979af046be" (UID: "d8ae56ce-101e-4066-848d-3f979af046be"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.300240 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-scripts" (OuterVolumeSpecName: "scripts") pod "d8ae56ce-101e-4066-848d-3f979af046be" (UID: "d8ae56ce-101e-4066-848d-3f979af046be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.300553 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ae56ce-101e-4066-848d-3f979af046be-kube-api-access-pt4zz" (OuterVolumeSpecName: "kube-api-access-pt4zz") pod "d8ae56ce-101e-4066-848d-3f979af046be" (UID: "d8ae56ce-101e-4066-848d-3f979af046be"). InnerVolumeSpecName "kube-api-access-pt4zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.332391 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d8ae56ce-101e-4066-848d-3f979af046be" (UID: "d8ae56ce-101e-4066-848d-3f979af046be"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.353800 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d8ae56ce-101e-4066-848d-3f979af046be" (UID: "d8ae56ce-101e-4066-848d-3f979af046be"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.359353 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.392637 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.393951 4669 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8ae56ce-101e-4066-848d-3f979af046be-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.393986 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.393999 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt4zz\" (UniqueName: \"kubernetes.io/projected/d8ae56ce-101e-4066-848d-3f979af046be-kube-api-access-pt4zz\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.394014 4669 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.394026 4669 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.403356 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "d8ae56ce-101e-4066-848d-3f979af046be" (UID: "d8ae56ce-101e-4066-848d-3f979af046be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.422674 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-config-data" (OuterVolumeSpecName: "config-data") pod "d8ae56ce-101e-4066-848d-3f979af046be" (UID: "d8ae56ce-101e-4066-848d-3f979af046be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.495788 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.495827 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ae56ce-101e-4066-848d-3f979af046be-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:47 crc kubenswrapper[4669]: I1001 11:48:47.664817 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecb9caa1-78bc-4cc0-848a-9a6afca67af0" path="/var/lib/kubelet/pods/ecb9caa1-78bc-4cc0-848a-9a6afca67af0/volumes" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.138928 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8ae56ce-101e-4066-848d-3f979af046be","Type":"ContainerDied","Data":"f7f56030fe8738100dacc1b0d86bce5996ff94c9e7b495a0bca26eb643558684"} Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.139031 4669 scope.go:117] "RemoveContainer" containerID="bde7ff022061983e6a3e6ee798b92f6e3dc94be33ab7e033f24171d913c9d139" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.139398 4669 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.141893 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a472690-2229-4a59-9ba7-8bd28109cb5c","Type":"ContainerStarted","Data":"0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4"} Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.141934 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a472690-2229-4a59-9ba7-8bd28109cb5c","Type":"ContainerStarted","Data":"06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490"} Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.161032 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.176797 4669 scope.go:117] "RemoveContainer" containerID="9f2064a85363ebee355425c2e8977d25f59e0ed4f25ade8bd2134a6081ad6518" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.183054 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.183030639 podStartE2EDuration="2.183030639s" podCreationTimestamp="2025-10-01 11:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:48:48.173117997 +0000 UTC m=+1219.272682974" watchObservedRunningTime="2025-10-01 11:48:48.183030639 +0000 UTC m=+1219.282595616" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.208495 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.226371 4669 scope.go:117] "RemoveContainer" containerID="e83d985540ab9099a134863c33ed415ca9aa6cb8a08fcf0b5c3e8085f7d9f7cd" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.244796 4669 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/ceilometer-0"] Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.260847 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:48:48 crc kubenswrapper[4669]: E1001 11:48:48.261485 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="proxy-httpd" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.261513 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="proxy-httpd" Oct 01 11:48:48 crc kubenswrapper[4669]: E1001 11:48:48.261535 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="sg-core" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.261544 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="sg-core" Oct 01 11:48:48 crc kubenswrapper[4669]: E1001 11:48:48.261577 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="ceilometer-central-agent" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.261586 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="ceilometer-central-agent" Oct 01 11:48:48 crc kubenswrapper[4669]: E1001 11:48:48.261626 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="ceilometer-notification-agent" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.261636 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="ceilometer-notification-agent" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.261902 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="proxy-httpd" 
Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.261938 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="ceilometer-central-agent" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.261964 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="ceilometer-notification-agent" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.261979 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ae56ce-101e-4066-848d-3f979af046be" containerName="sg-core" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.264480 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.269530 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.269814 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.269983 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.310129 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.322903 4669 scope.go:117] "RemoveContainer" containerID="b021b7c117ef49bef8f57eb7d62713a50d0d0a9f4176817a5cace07da6f711e6" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.422406 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-config-data\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 
11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.422484 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-run-httpd\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.422517 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.422608 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.422641 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.422663 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-log-httpd\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.422693 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6b8s\" (UniqueName: \"kubernetes.io/projected/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-kube-api-access-c6b8s\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.422718 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-scripts\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.464881 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hdx2s"] Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.466489 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.469447 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.469942 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.492963 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hdx2s"] Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.524312 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6b8s\" (UniqueName: \"kubernetes.io/projected/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-kube-api-access-c6b8s\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.524374 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-scripts\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.524465 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-config-data\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.524496 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-run-httpd\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.524512 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.524582 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.524611 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " 
pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.524629 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-log-httpd\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.525255 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-log-httpd\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.526196 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-run-httpd\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.531156 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.531708 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.532464 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.533108 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-config-data\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.539639 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-scripts\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.542897 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6b8s\" (UniqueName: \"kubernetes.io/projected/8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272-kube-api-access-c6b8s\") pod \"ceilometer-0\" (UID: \"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272\") " pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.619411 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.626582 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bw55\" (UniqueName: \"kubernetes.io/projected/694af0ac-d829-4caa-8350-1861400d0438-kube-api-access-2bw55\") pod \"nova-cell1-cell-mapping-hdx2s\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.626980 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-scripts\") pod \"nova-cell1-cell-mapping-hdx2s\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.627478 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hdx2s\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.627922 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-config-data\") pod \"nova-cell1-cell-mapping-hdx2s\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.731120 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bw55\" (UniqueName: \"kubernetes.io/projected/694af0ac-d829-4caa-8350-1861400d0438-kube-api-access-2bw55\") pod \"nova-cell1-cell-mapping-hdx2s\" 
(UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.731188 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-scripts\") pod \"nova-cell1-cell-mapping-hdx2s\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.731275 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hdx2s\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.731337 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-config-data\") pod \"nova-cell1-cell-mapping-hdx2s\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.741060 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hdx2s\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.759687 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-scripts\") pod \"nova-cell1-cell-mapping-hdx2s\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " 
pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.762199 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-config-data\") pod \"nova-cell1-cell-mapping-hdx2s\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.764286 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bw55\" (UniqueName: \"kubernetes.io/projected/694af0ac-d829-4caa-8350-1861400d0438-kube-api-access-2bw55\") pod \"nova-cell1-cell-mapping-hdx2s\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:48 crc kubenswrapper[4669]: I1001 11:48:48.795018 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:49 crc kubenswrapper[4669]: I1001 11:48:49.171570 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 11:48:49 crc kubenswrapper[4669]: I1001 11:48:49.281609 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hdx2s"] Oct 01 11:48:49 crc kubenswrapper[4669]: I1001 11:48:49.573336 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" Oct 01 11:48:49 crc kubenswrapper[4669]: I1001 11:48:49.707419 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ae56ce-101e-4066-848d-3f979af046be" path="/var/lib/kubelet/pods/d8ae56ce-101e-4066-848d-3f979af046be/volumes" Oct 01 11:48:49 crc kubenswrapper[4669]: I1001 11:48:49.708308 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-wfg8j"] Oct 01 11:48:49 crc kubenswrapper[4669]: I1001 11:48:49.708573 4669 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" podUID="2ba0e6c3-7c71-45df-acc3-127bec5afd42" containerName="dnsmasq-dns" containerID="cri-o://08c22e60089a340143ab6e0e0af759c564feb95c9fe2cde22226bf2ef544c517" gracePeriod=10 Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.210119 4669 generic.go:334] "Generic (PLEG): container finished" podID="2ba0e6c3-7c71-45df-acc3-127bec5afd42" containerID="08c22e60089a340143ab6e0e0af759c564feb95c9fe2cde22226bf2ef544c517" exitCode=0 Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.211419 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" event={"ID":"2ba0e6c3-7c71-45df-acc3-127bec5afd42","Type":"ContainerDied","Data":"08c22e60089a340143ab6e0e0af759c564feb95c9fe2cde22226bf2ef544c517"} Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.218984 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hdx2s" event={"ID":"694af0ac-d829-4caa-8350-1861400d0438","Type":"ContainerStarted","Data":"c764c22f963b6d085d64e6708f92d8490dcedce3a054b33c5e988108bff0293b"} Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.219251 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hdx2s" event={"ID":"694af0ac-d829-4caa-8350-1861400d0438","Type":"ContainerStarted","Data":"9034ff61020a71d35c8ccc143cb223cea52c39d6ddf331e7c1a694ac0e92e7c5"} Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.222284 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272","Type":"ContainerStarted","Data":"6cb38df3b0b4d477b58eb0d5672e3790944bbae9379ffcee0b0bf0e8f8b6fb3a"} Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.222414 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272","Type":"ContainerStarted","Data":"efc699adb8f0fb6aa8b5b5471c6e8d3969bb8accd0a425255bc4680da83ea3f5"} Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.244267 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hdx2s" podStartSLOduration=2.244244928 podStartE2EDuration="2.244244928s" podCreationTimestamp="2025-10-01 11:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:48:50.236626482 +0000 UTC m=+1221.336191459" watchObservedRunningTime="2025-10-01 11:48:50.244244928 +0000 UTC m=+1221.343809905" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.320012 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.375831 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-nb\") pod \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.376383 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-config\") pod \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.376541 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-sb\") pod \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " Oct 01 11:48:50 crc kubenswrapper[4669]: 
I1001 11:48:50.376702 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-svc\") pod \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.376835 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-swift-storage-0\") pod \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.377134 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-454lp\" (UniqueName: \"kubernetes.io/projected/2ba0e6c3-7c71-45df-acc3-127bec5afd42-kube-api-access-454lp\") pod \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\" (UID: \"2ba0e6c3-7c71-45df-acc3-127bec5afd42\") " Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.401782 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ba0e6c3-7c71-45df-acc3-127bec5afd42-kube-api-access-454lp" (OuterVolumeSpecName: "kube-api-access-454lp") pod "2ba0e6c3-7c71-45df-acc3-127bec5afd42" (UID: "2ba0e6c3-7c71-45df-acc3-127bec5afd42"). InnerVolumeSpecName "kube-api-access-454lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.454685 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ba0e6c3-7c71-45df-acc3-127bec5afd42" (UID: "2ba0e6c3-7c71-45df-acc3-127bec5afd42"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.474513 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ba0e6c3-7c71-45df-acc3-127bec5afd42" (UID: "2ba0e6c3-7c71-45df-acc3-127bec5afd42"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.480635 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.480685 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-454lp\" (UniqueName: \"kubernetes.io/projected/2ba0e6c3-7c71-45df-acc3-127bec5afd42-kube-api-access-454lp\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.480700 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.491092 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-config" (OuterVolumeSpecName: "config") pod "2ba0e6c3-7c71-45df-acc3-127bec5afd42" (UID: "2ba0e6c3-7c71-45df-acc3-127bec5afd42"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.494832 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ba0e6c3-7c71-45df-acc3-127bec5afd42" (UID: "2ba0e6c3-7c71-45df-acc3-127bec5afd42"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.511398 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2ba0e6c3-7c71-45df-acc3-127bec5afd42" (UID: "2ba0e6c3-7c71-45df-acc3-127bec5afd42"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.583122 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.583427 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:50 crc kubenswrapper[4669]: I1001 11:48:50.583510 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ba0e6c3-7c71-45df-acc3-127bec5afd42-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:51 crc kubenswrapper[4669]: I1001 11:48:51.236692 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" 
event={"ID":"2ba0e6c3-7c71-45df-acc3-127bec5afd42","Type":"ContainerDied","Data":"2dede691fa28e0d1a1f22827d6d512207a66f743103c9f2b622e04d1267c4b0b"} Oct 01 11:48:51 crc kubenswrapper[4669]: I1001 11:48:51.239017 4669 scope.go:117] "RemoveContainer" containerID="08c22e60089a340143ab6e0e0af759c564feb95c9fe2cde22226bf2ef544c517" Oct 01 11:48:51 crc kubenswrapper[4669]: I1001 11:48:51.237004 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-wfg8j" Oct 01 11:48:51 crc kubenswrapper[4669]: I1001 11:48:51.240820 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272","Type":"ContainerStarted","Data":"92808de77324533b3232e42b9b7f0e296f1e484b2fa2994e81a504897a5cac3f"} Oct 01 11:48:51 crc kubenswrapper[4669]: I1001 11:48:51.265881 4669 scope.go:117] "RemoveContainer" containerID="5745348de8bf960e21b71fc9926152c5fedc6f6584a3ab80e5ac06c24ea2c00c" Oct 01 11:48:51 crc kubenswrapper[4669]: I1001 11:48:51.298996 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-wfg8j"] Oct 01 11:48:51 crc kubenswrapper[4669]: I1001 11:48:51.311682 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-wfg8j"] Oct 01 11:48:51 crc kubenswrapper[4669]: I1001 11:48:51.659146 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ba0e6c3-7c71-45df-acc3-127bec5afd42" path="/var/lib/kubelet/pods/2ba0e6c3-7c71-45df-acc3-127bec5afd42/volumes" Oct 01 11:48:52 crc kubenswrapper[4669]: I1001 11:48:52.264380 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272","Type":"ContainerStarted","Data":"cf4dc817c8aca009dc080fdb091e34a067ee4d7144f058f0c55540fe7c60b73f"} Oct 01 11:48:54 crc kubenswrapper[4669]: I1001 11:48:54.301568 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272","Type":"ContainerStarted","Data":"dd7aea54cd68d8d6a20c265eebb6a9f5f027e75c095b7dfc1ec6ed8340a03721"} Oct 01 11:48:54 crc kubenswrapper[4669]: I1001 11:48:54.302427 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 11:48:54 crc kubenswrapper[4669]: I1001 11:48:54.334638 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.096367962 podStartE2EDuration="6.334605734s" podCreationTimestamp="2025-10-01 11:48:48 +0000 UTC" firstStartedPulling="2025-10-01 11:48:49.203006288 +0000 UTC m=+1220.302571265" lastFinishedPulling="2025-10-01 11:48:53.44124404 +0000 UTC m=+1224.540809037" observedRunningTime="2025-10-01 11:48:54.329678763 +0000 UTC m=+1225.429243750" watchObservedRunningTime="2025-10-01 11:48:54.334605734 +0000 UTC m=+1225.434170721" Oct 01 11:48:56 crc kubenswrapper[4669]: I1001 11:48:56.330771 4669 generic.go:334] "Generic (PLEG): container finished" podID="694af0ac-d829-4caa-8350-1861400d0438" containerID="c764c22f963b6d085d64e6708f92d8490dcedce3a054b33c5e988108bff0293b" exitCode=0 Oct 01 11:48:56 crc kubenswrapper[4669]: I1001 11:48:56.331445 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hdx2s" event={"ID":"694af0ac-d829-4caa-8350-1861400d0438","Type":"ContainerDied","Data":"c764c22f963b6d085d64e6708f92d8490dcedce3a054b33c5e988108bff0293b"} Oct 01 11:48:56 crc kubenswrapper[4669]: I1001 11:48:56.517849 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 11:48:56 crc kubenswrapper[4669]: I1001 11:48:56.520556 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.534215 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.534260 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.820657 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.891322 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-scripts\") pod \"694af0ac-d829-4caa-8350-1861400d0438\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.891424 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-config-data\") pod \"694af0ac-d829-4caa-8350-1861400d0438\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.891677 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-combined-ca-bundle\") pod \"694af0ac-d829-4caa-8350-1861400d0438\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.891779 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bw55\" (UniqueName: 
\"kubernetes.io/projected/694af0ac-d829-4caa-8350-1861400d0438-kube-api-access-2bw55\") pod \"694af0ac-d829-4caa-8350-1861400d0438\" (UID: \"694af0ac-d829-4caa-8350-1861400d0438\") " Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.898404 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-scripts" (OuterVolumeSpecName: "scripts") pod "694af0ac-d829-4caa-8350-1861400d0438" (UID: "694af0ac-d829-4caa-8350-1861400d0438"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.910445 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/694af0ac-d829-4caa-8350-1861400d0438-kube-api-access-2bw55" (OuterVolumeSpecName: "kube-api-access-2bw55") pod "694af0ac-d829-4caa-8350-1861400d0438" (UID: "694af0ac-d829-4caa-8350-1861400d0438"). InnerVolumeSpecName "kube-api-access-2bw55". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.932754 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-config-data" (OuterVolumeSpecName: "config-data") pod "694af0ac-d829-4caa-8350-1861400d0438" (UID: "694af0ac-d829-4caa-8350-1861400d0438"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.934206 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "694af0ac-d829-4caa-8350-1861400d0438" (UID: "694af0ac-d829-4caa-8350-1861400d0438"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.996728 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bw55\" (UniqueName: \"kubernetes.io/projected/694af0ac-d829-4caa-8350-1861400d0438-kube-api-access-2bw55\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.996788 4669 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.996801 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:57 crc kubenswrapper[4669]: I1001 11:48:57.996814 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694af0ac-d829-4caa-8350-1861400d0438-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:48:58 crc kubenswrapper[4669]: I1001 11:48:58.402004 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hdx2s" event={"ID":"694af0ac-d829-4caa-8350-1861400d0438","Type":"ContainerDied","Data":"9034ff61020a71d35c8ccc143cb223cea52c39d6ddf331e7c1a694ac0e92e7c5"} Oct 01 11:48:58 crc kubenswrapper[4669]: I1001 11:48:58.402463 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9034ff61020a71d35c8ccc143cb223cea52c39d6ddf331e7c1a694ac0e92e7c5" Oct 01 11:48:58 crc kubenswrapper[4669]: I1001 11:48:58.402217 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hdx2s" Oct 01 11:48:58 crc kubenswrapper[4669]: I1001 11:48:58.588567 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:48:58 crc kubenswrapper[4669]: I1001 11:48:58.589488 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerName="nova-api-api" containerID="cri-o://0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4" gracePeriod=30 Oct 01 11:48:58 crc kubenswrapper[4669]: I1001 11:48:58.589677 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerName="nova-api-log" containerID="cri-o://06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490" gracePeriod=30 Oct 01 11:48:58 crc kubenswrapper[4669]: I1001 11:48:58.661208 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:48:58 crc kubenswrapper[4669]: I1001 11:48:58.661493 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9e402de8-e8f7-4c99-8a1a-58e95ef031f2" containerName="nova-scheduler-scheduler" containerID="cri-o://de88488c5f98ad42a25a9fc5a6eed7582c8c86b9b7e8b0d39fcb0e78261995d3" gracePeriod=30 Oct 01 11:48:58 crc kubenswrapper[4669]: I1001 11:48:58.678966 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:48:58 crc kubenswrapper[4669]: I1001 11:48:58.679286 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-log" containerID="cri-o://0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3" gracePeriod=30 Oct 01 11:48:58 crc kubenswrapper[4669]: I1001 11:48:58.679487 4669 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-metadata" containerID="cri-o://f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6" gracePeriod=30 Oct 01 11:48:59 crc kubenswrapper[4669]: E1001 11:48:59.308720 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="de88488c5f98ad42a25a9fc5a6eed7582c8c86b9b7e8b0d39fcb0e78261995d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 11:48:59 crc kubenswrapper[4669]: E1001 11:48:59.310594 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="de88488c5f98ad42a25a9fc5a6eed7582c8c86b9b7e8b0d39fcb0e78261995d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 11:48:59 crc kubenswrapper[4669]: E1001 11:48:59.319304 4669 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="de88488c5f98ad42a25a9fc5a6eed7582c8c86b9b7e8b0d39fcb0e78261995d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 11:48:59 crc kubenswrapper[4669]: E1001 11:48:59.319397 4669 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9e402de8-e8f7-4c99-8a1a-58e95ef031f2" containerName="nova-scheduler-scheduler" Oct 01 11:48:59 crc kubenswrapper[4669]: I1001 11:48:59.416647 4669 generic.go:334] "Generic (PLEG): container finished" 
podID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerID="06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490" exitCode=143 Oct 01 11:48:59 crc kubenswrapper[4669]: I1001 11:48:59.416785 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a472690-2229-4a59-9ba7-8bd28109cb5c","Type":"ContainerDied","Data":"06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490"} Oct 01 11:48:59 crc kubenswrapper[4669]: I1001 11:48:59.420299 4669 generic.go:334] "Generic (PLEG): container finished" podID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerID="0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3" exitCode=143 Oct 01 11:48:59 crc kubenswrapper[4669]: I1001 11:48:59.420346 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"285b539b-1b0c-4bb1-a197-ca28afe29810","Type":"ContainerDied","Data":"0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3"} Oct 01 11:49:01 crc kubenswrapper[4669]: I1001 11:49:01.814169 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:36908->10.217.0.193:8775: read: connection reset by peer" Oct 01 11:49:01 crc kubenswrapper[4669]: I1001 11:49:01.814186 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:36920->10.217.0.193:8775: read: connection reset by peer" Oct 01 11:49:01 crc kubenswrapper[4669]: I1001 11:49:01.863516 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:49:01 crc kubenswrapper[4669]: I1001 11:49:01.863598 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.381838 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.405321 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzjp2\" (UniqueName: \"kubernetes.io/projected/285b539b-1b0c-4bb1-a197-ca28afe29810-kube-api-access-hzjp2\") pod \"285b539b-1b0c-4bb1-a197-ca28afe29810\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.405639 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/285b539b-1b0c-4bb1-a197-ca28afe29810-logs\") pod \"285b539b-1b0c-4bb1-a197-ca28afe29810\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.405677 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-combined-ca-bundle\") pod \"285b539b-1b0c-4bb1-a197-ca28afe29810\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.405730 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-config-data\") pod 
\"285b539b-1b0c-4bb1-a197-ca28afe29810\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.405773 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-nova-metadata-tls-certs\") pod \"285b539b-1b0c-4bb1-a197-ca28afe29810\" (UID: \"285b539b-1b0c-4bb1-a197-ca28afe29810\") " Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.407106 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/285b539b-1b0c-4bb1-a197-ca28afe29810-logs" (OuterVolumeSpecName: "logs") pod "285b539b-1b0c-4bb1-a197-ca28afe29810" (UID: "285b539b-1b0c-4bb1-a197-ca28afe29810"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.428165 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/285b539b-1b0c-4bb1-a197-ca28afe29810-kube-api-access-hzjp2" (OuterVolumeSpecName: "kube-api-access-hzjp2") pod "285b539b-1b0c-4bb1-a197-ca28afe29810" (UID: "285b539b-1b0c-4bb1-a197-ca28afe29810"). InnerVolumeSpecName "kube-api-access-hzjp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.470299 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "285b539b-1b0c-4bb1-a197-ca28afe29810" (UID: "285b539b-1b0c-4bb1-a197-ca28afe29810"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.473501 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-config-data" (OuterVolumeSpecName: "config-data") pod "285b539b-1b0c-4bb1-a197-ca28afe29810" (UID: "285b539b-1b0c-4bb1-a197-ca28afe29810"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.490445 4669 generic.go:334] "Generic (PLEG): container finished" podID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerID="f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6" exitCode=0 Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.490503 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"285b539b-1b0c-4bb1-a197-ca28afe29810","Type":"ContainerDied","Data":"f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6"} Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.490538 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"285b539b-1b0c-4bb1-a197-ca28afe29810","Type":"ContainerDied","Data":"0a26f6631df18abbc611063e92db5f0449f4ccc1aa90c288ae1bb9339c0dfbeb"} Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.490557 4669 scope.go:117] "RemoveContainer" containerID="f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.490711 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.508201 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.508244 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzjp2\" (UniqueName: \"kubernetes.io/projected/285b539b-1b0c-4bb1-a197-ca28afe29810-kube-api-access-hzjp2\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.508259 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/285b539b-1b0c-4bb1-a197-ca28afe29810-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.508274 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.511424 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "285b539b-1b0c-4bb1-a197-ca28afe29810" (UID: "285b539b-1b0c-4bb1-a197-ca28afe29810"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.566300 4669 scope.go:117] "RemoveContainer" containerID="0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.606353 4669 scope.go:117] "RemoveContainer" containerID="f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6" Oct 01 11:49:02 crc kubenswrapper[4669]: E1001 11:49:02.607051 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6\": container with ID starting with f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6 not found: ID does not exist" containerID="f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.607222 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6"} err="failed to get container status \"f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6\": rpc error: code = NotFound desc = could not find container \"f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6\": container with ID starting with f87d6412a8753877d660c3d009eaea7cf1bc25e8f484b5bf9dbcd0c999e743e6 not found: ID does not exist" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.607262 4669 scope.go:117] "RemoveContainer" containerID="0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3" Oct 01 11:49:02 crc kubenswrapper[4669]: E1001 11:49:02.607880 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3\": container with ID starting with 
0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3 not found: ID does not exist" containerID="0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.607938 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3"} err="failed to get container status \"0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3\": rpc error: code = NotFound desc = could not find container \"0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3\": container with ID starting with 0129da68543146cfd1b11670ad29b601dc5cd1ae743d3755e04c93c8f40c9cb3 not found: ID does not exist" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.613941 4669 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/285b539b-1b0c-4bb1-a197-ca28afe29810-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.830195 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.837970 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.857390 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:49:02 crc kubenswrapper[4669]: E1001 11:49:02.869270 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-log" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.869326 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-log" Oct 01 11:49:02 crc kubenswrapper[4669]: E1001 11:49:02.869356 4669 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="694af0ac-d829-4caa-8350-1861400d0438" containerName="nova-manage" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.869362 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="694af0ac-d829-4caa-8350-1861400d0438" containerName="nova-manage" Oct 01 11:49:02 crc kubenswrapper[4669]: E1001 11:49:02.869390 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba0e6c3-7c71-45df-acc3-127bec5afd42" containerName="dnsmasq-dns" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.869397 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba0e6c3-7c71-45df-acc3-127bec5afd42" containerName="dnsmasq-dns" Oct 01 11:49:02 crc kubenswrapper[4669]: E1001 11:49:02.869416 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ba0e6c3-7c71-45df-acc3-127bec5afd42" containerName="init" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.869423 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba0e6c3-7c71-45df-acc3-127bec5afd42" containerName="init" Oct 01 11:49:02 crc kubenswrapper[4669]: E1001 11:49:02.869476 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-metadata" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.869483 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-metadata" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.869890 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ba0e6c3-7c71-45df-acc3-127bec5afd42" containerName="dnsmasq-dns" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.869906 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-metadata" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.869933 4669 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="694af0ac-d829-4caa-8350-1861400d0438" containerName="nova-manage" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.869959 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" containerName="nova-metadata-log" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.876683 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.876862 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.882868 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 11:49:02 crc kubenswrapper[4669]: I1001 11:49:02.883376 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.023825 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80be53d5-3338-467a-9be5-779722416d52-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.023878 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80be53d5-3338-467a-9be5-779722416d52-config-data\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.024467 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87kxx\" (UniqueName: 
\"kubernetes.io/projected/80be53d5-3338-467a-9be5-779722416d52-kube-api-access-87kxx\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.024540 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80be53d5-3338-467a-9be5-779722416d52-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.024557 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80be53d5-3338-467a-9be5-779722416d52-logs\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.126068 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87kxx\" (UniqueName: \"kubernetes.io/projected/80be53d5-3338-467a-9be5-779722416d52-kube-api-access-87kxx\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.126197 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80be53d5-3338-467a-9be5-779722416d52-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.126236 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80be53d5-3338-467a-9be5-779722416d52-logs\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " 
pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.126333 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80be53d5-3338-467a-9be5-779722416d52-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.126360 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80be53d5-3338-467a-9be5-779722416d52-config-data\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.127124 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80be53d5-3338-467a-9be5-779722416d52-logs\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.130891 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80be53d5-3338-467a-9be5-779722416d52-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.131843 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80be53d5-3338-467a-9be5-779722416d52-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.140445 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/80be53d5-3338-467a-9be5-779722416d52-config-data\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.151681 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87kxx\" (UniqueName: \"kubernetes.io/projected/80be53d5-3338-467a-9be5-779722416d52-kube-api-access-87kxx\") pod \"nova-metadata-0\" (UID: \"80be53d5-3338-467a-9be5-779722416d52\") " pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.255495 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.464931 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.517651 4669 generic.go:334] "Generic (PLEG): container finished" podID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerID="0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4" exitCode=0 Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.517734 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a472690-2229-4a59-9ba7-8bd28109cb5c","Type":"ContainerDied","Data":"0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4"} Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.517768 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a472690-2229-4a59-9ba7-8bd28109cb5c","Type":"ContainerDied","Data":"3a5ffedccec3a50a97be00a96ee9f2efa8749a7adea54a126b559d554e1aa869"} Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.517788 4669 scope.go:117] "RemoveContainer" containerID="0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.517912 4669 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.525485 4669 generic.go:334] "Generic (PLEG): container finished" podID="9e402de8-e8f7-4c99-8a1a-58e95ef031f2" containerID="de88488c5f98ad42a25a9fc5a6eed7582c8c86b9b7e8b0d39fcb0e78261995d3" exitCode=0 Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.525540 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9e402de8-e8f7-4c99-8a1a-58e95ef031f2","Type":"ContainerDied","Data":"de88488c5f98ad42a25a9fc5a6eed7582c8c86b9b7e8b0d39fcb0e78261995d3"} Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.536143 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8kxb\" (UniqueName: \"kubernetes.io/projected/6a472690-2229-4a59-9ba7-8bd28109cb5c-kube-api-access-f8kxb\") pod \"6a472690-2229-4a59-9ba7-8bd28109cb5c\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.536210 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-config-data\") pod \"6a472690-2229-4a59-9ba7-8bd28109cb5c\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.536355 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-public-tls-certs\") pod \"6a472690-2229-4a59-9ba7-8bd28109cb5c\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.536446 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-combined-ca-bundle\") pod 
\"6a472690-2229-4a59-9ba7-8bd28109cb5c\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.536554 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a472690-2229-4a59-9ba7-8bd28109cb5c-logs\") pod \"6a472690-2229-4a59-9ba7-8bd28109cb5c\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.536778 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-internal-tls-certs\") pod \"6a472690-2229-4a59-9ba7-8bd28109cb5c\" (UID: \"6a472690-2229-4a59-9ba7-8bd28109cb5c\") " Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.537529 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a472690-2229-4a59-9ba7-8bd28109cb5c-logs" (OuterVolumeSpecName: "logs") pod "6a472690-2229-4a59-9ba7-8bd28109cb5c" (UID: "6a472690-2229-4a59-9ba7-8bd28109cb5c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.541200 4669 scope.go:117] "RemoveContainer" containerID="06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.543195 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a472690-2229-4a59-9ba7-8bd28109cb5c-kube-api-access-f8kxb" (OuterVolumeSpecName: "kube-api-access-f8kxb") pod "6a472690-2229-4a59-9ba7-8bd28109cb5c" (UID: "6a472690-2229-4a59-9ba7-8bd28109cb5c"). InnerVolumeSpecName "kube-api-access-f8kxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.565698 4669 scope.go:117] "RemoveContainer" containerID="0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4" Oct 01 11:49:03 crc kubenswrapper[4669]: E1001 11:49:03.568709 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4\": container with ID starting with 0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4 not found: ID does not exist" containerID="0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.568762 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4"} err="failed to get container status \"0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4\": rpc error: code = NotFound desc = could not find container \"0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4\": container with ID starting with 0f200797c8d944929c298d78b9c79c60a2b71b21fe9b8a6e12f7a0dc8315fdd4 not found: ID does not exist" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.568856 4669 scope.go:117] "RemoveContainer" containerID="06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490" Oct 01 11:49:03 crc kubenswrapper[4669]: E1001 11:49:03.574456 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490\": container with ID starting with 06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490 not found: ID does not exist" containerID="06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.574486 
4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490"} err="failed to get container status \"06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490\": rpc error: code = NotFound desc = could not find container \"06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490\": container with ID starting with 06d11d3cb1c22d0f39d4639e4e09c33f9214d6eec49a12aff4148e34b9e5f490 not found: ID does not exist" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.578324 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-config-data" (OuterVolumeSpecName: "config-data") pod "6a472690-2229-4a59-9ba7-8bd28109cb5c" (UID: "6a472690-2229-4a59-9ba7-8bd28109cb5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.579968 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a472690-2229-4a59-9ba7-8bd28109cb5c" (UID: "6a472690-2229-4a59-9ba7-8bd28109cb5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.601857 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6a472690-2229-4a59-9ba7-8bd28109cb5c" (UID: "6a472690-2229-4a59-9ba7-8bd28109cb5c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:49:03 crc kubenswrapper[4669]: I1001 11:49:03.613052 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6a472690-2229-4a59-9ba7-8bd28109cb5c" (UID: "6a472690-2229-4a59-9ba7-8bd28109cb5c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.639292 4669 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.639335 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8kxb\" (UniqueName: \"kubernetes.io/projected/6a472690-2229-4a59-9ba7-8bd28109cb5c-kube-api-access-f8kxb\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.639357 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.639370 4669 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.639384 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a472690-2229-4a59-9ba7-8bd28109cb5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.639396 4669 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/6a472690-2229-4a59-9ba7-8bd28109cb5c-logs\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.656276 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="285b539b-1b0c-4bb1-a197-ca28afe29810" path="/var/lib/kubelet/pods/285b539b-1b0c-4bb1-a197-ca28afe29810/volumes" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.788266 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 11:49:04 crc kubenswrapper[4669]: W1001 11:49:03.799291 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80be53d5_3338_467a_9be5_779722416d52.slice/crio-bd8bca03ef0b4f68cca2a71ba9330f30e9ddd371a0d93a9ca5a88b528dfb2093 WatchSource:0}: Error finding container bd8bca03ef0b4f68cca2a71ba9330f30e9ddd371a0d93a9ca5a88b528dfb2093: Status 404 returned error can't find the container with id bd8bca03ef0b4f68cca2a71ba9330f30e9ddd371a0d93a9ca5a88b528dfb2093 Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.815317 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.850176 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-combined-ca-bundle\") pod \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\" (UID: \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.850291 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x86vm\" (UniqueName: \"kubernetes.io/projected/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-kube-api-access-x86vm\") pod \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\" (UID: \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.850321 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-config-data\") pod \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\" (UID: \"9e402de8-e8f7-4c99-8a1a-58e95ef031f2\") " Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.855446 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-kube-api-access-x86vm" (OuterVolumeSpecName: "kube-api-access-x86vm") pod "9e402de8-e8f7-4c99-8a1a-58e95ef031f2" (UID: "9e402de8-e8f7-4c99-8a1a-58e95ef031f2"). InnerVolumeSpecName "kube-api-access-x86vm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.891337 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.911492 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.915486 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e402de8-e8f7-4c99-8a1a-58e95ef031f2" (UID: "9e402de8-e8f7-4c99-8a1a-58e95ef031f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.919855 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 11:49:04 crc kubenswrapper[4669]: E1001 11:49:03.920930 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerName="nova-api-log" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.920954 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerName="nova-api-log" Oct 01 11:49:04 crc kubenswrapper[4669]: E1001 11:49:03.920987 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e402de8-e8f7-4c99-8a1a-58e95ef031f2" containerName="nova-scheduler-scheduler" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.920997 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e402de8-e8f7-4c99-8a1a-58e95ef031f2" containerName="nova-scheduler-scheduler" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.922050 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-config-data" (OuterVolumeSpecName: "config-data") pod 
"9e402de8-e8f7-4c99-8a1a-58e95ef031f2" (UID: "9e402de8-e8f7-4c99-8a1a-58e95ef031f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:49:04 crc kubenswrapper[4669]: E1001 11:49:03.927736 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerName="nova-api-api" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.927786 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerName="nova-api-api" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.928277 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerName="nova-api-api" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.928293 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a472690-2229-4a59-9ba7-8bd28109cb5c" containerName="nova-api-log" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.928324 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e402de8-e8f7-4c99-8a1a-58e95ef031f2" containerName="nova-scheduler-scheduler" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.938962 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.943990 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.944519 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.944902 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.945111 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.978608 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.978653 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x86vm\" (UniqueName: \"kubernetes.io/projected/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-kube-api-access-x86vm\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:03.978681 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e402de8-e8f7-4c99-8a1a-58e95ef031f2-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.081126 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-config-data\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.081201 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-public-tls-certs\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.081478 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.081629 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8mjh\" (UniqueName: \"kubernetes.io/projected/b39855ee-c66e-4f78-8128-a0149c9431da-kube-api-access-k8mjh\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.081689 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b39855ee-c66e-4f78-8128-a0149c9431da-logs\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.081741 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.186063 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-config-data\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.187148 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-public-tls-certs\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.187223 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.187283 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8mjh\" (UniqueName: \"kubernetes.io/projected/b39855ee-c66e-4f78-8128-a0149c9431da-kube-api-access-k8mjh\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.187648 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b39855ee-c66e-4f78-8128-a0149c9431da-logs\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.187686 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 
11:49:04.188297 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b39855ee-c66e-4f78-8128-a0149c9431da-logs\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.196652 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-config-data\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.198031 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.199935 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.201315 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b39855ee-c66e-4f78-8128-a0149c9431da-public-tls-certs\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.212624 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8mjh\" (UniqueName: \"kubernetes.io/projected/b39855ee-c66e-4f78-8128-a0149c9431da-kube-api-access-k8mjh\") pod \"nova-api-0\" (UID: \"b39855ee-c66e-4f78-8128-a0149c9431da\") " 
pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.436021 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.540542 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9e402de8-e8f7-4c99-8a1a-58e95ef031f2","Type":"ContainerDied","Data":"fbbeb3a8b9880d697ff39e17b00fb77a9b6312f91b5720d0807434a8e9bf669e"} Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.540623 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.540646 4669 scope.go:117] "RemoveContainer" containerID="de88488c5f98ad42a25a9fc5a6eed7582c8c86b9b7e8b0d39fcb0e78261995d3" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.544892 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80be53d5-3338-467a-9be5-779722416d52","Type":"ContainerStarted","Data":"914b86c1394a008aedab2c67788c14b032bd299d2433e7ad1d889efd1fa93970"} Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.545350 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80be53d5-3338-467a-9be5-779722416d52","Type":"ContainerStarted","Data":"c8a26fd9ade550403e20920a3d552e1da87adf326252702c75655eb6438ddc7e"} Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.545363 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80be53d5-3338-467a-9be5-779722416d52","Type":"ContainerStarted","Data":"bd8bca03ef0b4f68cca2a71ba9330f30e9ddd371a0d93a9ca5a88b528dfb2093"} Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.571544 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.571525976 podStartE2EDuration="2.571525976s" 
podCreationTimestamp="2025-10-01 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:49:04.569166128 +0000 UTC m=+1235.668731105" watchObservedRunningTime="2025-10-01 11:49:04.571525976 +0000 UTC m=+1235.671090953" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.606671 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.623513 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.649448 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.651753 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.654194 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.664547 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.704450 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ce3ac8-78ca-445e-acd1-995d99a5757a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"25ce3ac8-78ca-445e-acd1-995d99a5757a\") " pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.706342 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spjxk\" (UniqueName: \"kubernetes.io/projected/25ce3ac8-78ca-445e-acd1-995d99a5757a-kube-api-access-spjxk\") pod \"nova-scheduler-0\" (UID: 
\"25ce3ac8-78ca-445e-acd1-995d99a5757a\") " pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.706463 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ce3ac8-78ca-445e-acd1-995d99a5757a-config-data\") pod \"nova-scheduler-0\" (UID: \"25ce3ac8-78ca-445e-acd1-995d99a5757a\") " pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.808386 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ce3ac8-78ca-445e-acd1-995d99a5757a-config-data\") pod \"nova-scheduler-0\" (UID: \"25ce3ac8-78ca-445e-acd1-995d99a5757a\") " pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.808572 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ce3ac8-78ca-445e-acd1-995d99a5757a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"25ce3ac8-78ca-445e-acd1-995d99a5757a\") " pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.808603 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spjxk\" (UniqueName: \"kubernetes.io/projected/25ce3ac8-78ca-445e-acd1-995d99a5757a-kube-api-access-spjxk\") pod \"nova-scheduler-0\" (UID: \"25ce3ac8-78ca-445e-acd1-995d99a5757a\") " pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.814197 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ce3ac8-78ca-445e-acd1-995d99a5757a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"25ce3ac8-78ca-445e-acd1-995d99a5757a\") " pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.818056 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ce3ac8-78ca-445e-acd1-995d99a5757a-config-data\") pod \"nova-scheduler-0\" (UID: \"25ce3ac8-78ca-445e-acd1-995d99a5757a\") " pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.825685 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spjxk\" (UniqueName: \"kubernetes.io/projected/25ce3ac8-78ca-445e-acd1-995d99a5757a-kube-api-access-spjxk\") pod \"nova-scheduler-0\" (UID: \"25ce3ac8-78ca-445e-acd1-995d99a5757a\") " pod="openstack/nova-scheduler-0" Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.947502 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 11:49:04 crc kubenswrapper[4669]: I1001 11:49:04.970197 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 11:49:05 crc kubenswrapper[4669]: W1001 11:49:05.539191 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ce3ac8_78ca_445e_acd1_995d99a5757a.slice/crio-f50c317762773f5cbd72a6e3687e2ada36c871bd84133590f9904aaeef6c518a WatchSource:0}: Error finding container f50c317762773f5cbd72a6e3687e2ada36c871bd84133590f9904aaeef6c518a: Status 404 returned error can't find the container with id f50c317762773f5cbd72a6e3687e2ada36c871bd84133590f9904aaeef6c518a Oct 01 11:49:05 crc kubenswrapper[4669]: I1001 11:49:05.542107 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 11:49:05 crc kubenswrapper[4669]: I1001 11:49:05.558059 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"25ce3ac8-78ca-445e-acd1-995d99a5757a","Type":"ContainerStarted","Data":"f50c317762773f5cbd72a6e3687e2ada36c871bd84133590f9904aaeef6c518a"} Oct 01 11:49:05 crc 
kubenswrapper[4669]: I1001 11:49:05.563142 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b39855ee-c66e-4f78-8128-a0149c9431da","Type":"ContainerStarted","Data":"850bf6ced6d3ed83aee0f7d6020ac6de0a7db8e6cb90da1277a67318641da33d"} Oct 01 11:49:05 crc kubenswrapper[4669]: I1001 11:49:05.563224 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b39855ee-c66e-4f78-8128-a0149c9431da","Type":"ContainerStarted","Data":"eee0f1a89a89e39dec83dcf1ee315c8e74e02eab8316e55c3ae5b05cce6f5a53"} Oct 01 11:49:05 crc kubenswrapper[4669]: I1001 11:49:05.590139 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.590111106 podStartE2EDuration="2.590111106s" podCreationTimestamp="2025-10-01 11:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:49:05.584334314 +0000 UTC m=+1236.683899331" watchObservedRunningTime="2025-10-01 11:49:05.590111106 +0000 UTC m=+1236.689676093" Oct 01 11:49:05 crc kubenswrapper[4669]: I1001 11:49:05.666956 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a472690-2229-4a59-9ba7-8bd28109cb5c" path="/var/lib/kubelet/pods/6a472690-2229-4a59-9ba7-8bd28109cb5c/volumes" Oct 01 11:49:05 crc kubenswrapper[4669]: I1001 11:49:05.667709 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e402de8-e8f7-4c99-8a1a-58e95ef031f2" path="/var/lib/kubelet/pods/9e402de8-e8f7-4c99-8a1a-58e95ef031f2/volumes" Oct 01 11:49:06 crc kubenswrapper[4669]: I1001 11:49:06.586240 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"25ce3ac8-78ca-445e-acd1-995d99a5757a","Type":"ContainerStarted","Data":"0d411278aa90670abda445ae422b7a55c022e54d2d82ac0c4950f82acd6738bc"} Oct 01 11:49:06 crc kubenswrapper[4669]: I1001 11:49:06.592910 4669 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b39855ee-c66e-4f78-8128-a0149c9431da","Type":"ContainerStarted","Data":"eba3f2513e3be01f25bb29d98f455f2abd8645a44f7347460a656f5fb894f2e3"} Oct 01 11:49:06 crc kubenswrapper[4669]: I1001 11:49:06.622818 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.622779302 podStartE2EDuration="2.622779302s" podCreationTimestamp="2025-10-01 11:49:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:49:06.612158701 +0000 UTC m=+1237.711723698" watchObservedRunningTime="2025-10-01 11:49:06.622779302 +0000 UTC m=+1237.722344279" Oct 01 11:49:08 crc kubenswrapper[4669]: I1001 11:49:08.256583 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 11:49:08 crc kubenswrapper[4669]: I1001 11:49:08.257017 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 11:49:09 crc kubenswrapper[4669]: I1001 11:49:09.970744 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 11:49:13 crc kubenswrapper[4669]: I1001 11:49:13.255837 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 11:49:13 crc kubenswrapper[4669]: I1001 11:49:13.256402 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 11:49:14 crc kubenswrapper[4669]: I1001 11:49:14.271476 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80be53d5-3338-467a-9be5-779722416d52" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" 
Oct 01 11:49:14 crc kubenswrapper[4669]: I1001 11:49:14.272278 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80be53d5-3338-467a-9be5-779722416d52" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 11:49:14 crc kubenswrapper[4669]: I1001 11:49:14.437694 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 11:49:14 crc kubenswrapper[4669]: I1001 11:49:14.437751 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 11:49:14 crc kubenswrapper[4669]: I1001 11:49:14.971299 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 11:49:15 crc kubenswrapper[4669]: I1001 11:49:15.012545 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 11:49:15 crc kubenswrapper[4669]: I1001 11:49:15.454296 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b39855ee-c66e-4f78-8128-a0149c9431da" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 11:49:15 crc kubenswrapper[4669]: I1001 11:49:15.454340 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b39855ee-c66e-4f78-8128-a0149c9431da" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 11:49:15 crc kubenswrapper[4669]: I1001 11:49:15.763439 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 11:49:18 crc kubenswrapper[4669]: I1001 
11:49:18.634423 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 11:49:23 crc kubenswrapper[4669]: I1001 11:49:23.263453 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 11:49:23 crc kubenswrapper[4669]: I1001 11:49:23.264373 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 11:49:23 crc kubenswrapper[4669]: I1001 11:49:23.269033 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 11:49:23 crc kubenswrapper[4669]: I1001 11:49:23.269946 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 11:49:24 crc kubenswrapper[4669]: I1001 11:49:24.449155 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 11:49:24 crc kubenswrapper[4669]: I1001 11:49:24.450640 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 11:49:24 crc kubenswrapper[4669]: I1001 11:49:24.451304 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 11:49:24 crc kubenswrapper[4669]: I1001 11:49:24.451385 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 11:49:24 crc kubenswrapper[4669]: I1001 11:49:24.468444 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 11:49:24 crc kubenswrapper[4669]: I1001 11:49:24.469949 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 11:49:31 crc kubenswrapper[4669]: I1001 11:49:31.864265 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:49:31 crc kubenswrapper[4669]: I1001 11:49:31.865306 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:49:31 crc kubenswrapper[4669]: I1001 11:49:31.865406 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:49:31 crc kubenswrapper[4669]: I1001 11:49:31.870146 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e52cf47b1ea2351c50bcd89b78dca4005cb050fc916aa94ef178ab99a189cf3"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 11:49:31 crc kubenswrapper[4669]: I1001 11:49:31.870311 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://7e52cf47b1ea2351c50bcd89b78dca4005cb050fc916aa94ef178ab99a189cf3" gracePeriod=600 Oct 01 11:49:32 crc kubenswrapper[4669]: I1001 11:49:32.378213 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 11:49:32 crc kubenswrapper[4669]: I1001 11:49:32.990695 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="7e52cf47b1ea2351c50bcd89b78dca4005cb050fc916aa94ef178ab99a189cf3" exitCode=0 Oct 01 11:49:32 crc 
kubenswrapper[4669]: I1001 11:49:32.990765 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"7e52cf47b1ea2351c50bcd89b78dca4005cb050fc916aa94ef178ab99a189cf3"} Oct 01 11:49:32 crc kubenswrapper[4669]: I1001 11:49:32.991033 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"7b1236276e91901ca356b23317942bcba8b16d3a037aab0000d2acb95db6570b"} Oct 01 11:49:32 crc kubenswrapper[4669]: I1001 11:49:32.991062 4669 scope.go:117] "RemoveContainer" containerID="ef1fa470dbb217bde08acd53a153a9e8382565310fe4c3c6cd2c78b6a193aa31" Oct 01 11:49:33 crc kubenswrapper[4669]: I1001 11:49:33.791477 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 11:49:37 crc kubenswrapper[4669]: I1001 11:49:37.040272 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="500653c7-d0f6-46d5-9411-60a17569fdd3" containerName="rabbitmq" containerID="cri-o://4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9" gracePeriod=604796 Oct 01 11:49:38 crc kubenswrapper[4669]: I1001 11:49:38.789228 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="4619f705-9393-48c8-bc69-2d6183546af2" containerName="rabbitmq" containerID="cri-o://b5c767a1b33375c7e3b5684bfa31b9b72e153b30e7bbe663710482868411b6fa" gracePeriod=604796 Oct 01 11:49:39 crc kubenswrapper[4669]: I1001 11:49:39.512036 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="500653c7-d0f6-46d5-9411-60a17569fdd3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection 
refused" Oct 01 11:49:40 crc kubenswrapper[4669]: I1001 11:49:40.146247 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="4619f705-9393-48c8-bc69-2d6183546af2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.745554 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.926725 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-tls\") pod \"500653c7-d0f6-46d5-9411-60a17569fdd3\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.926828 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/500653c7-d0f6-46d5-9411-60a17569fdd3-erlang-cookie-secret\") pod \"500653c7-d0f6-46d5-9411-60a17569fdd3\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.926878 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"500653c7-d0f6-46d5-9411-60a17569fdd3\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.926913 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-config-data\") pod \"500653c7-d0f6-46d5-9411-60a17569fdd3\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.926969 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-server-conf\") pod \"500653c7-d0f6-46d5-9411-60a17569fdd3\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.927015 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-confd\") pod \"500653c7-d0f6-46d5-9411-60a17569fdd3\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.927182 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-plugins\") pod \"500653c7-d0f6-46d5-9411-60a17569fdd3\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.927215 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l52c8\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-kube-api-access-l52c8\") pod \"500653c7-d0f6-46d5-9411-60a17569fdd3\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.927282 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-erlang-cookie\") pod \"500653c7-d0f6-46d5-9411-60a17569fdd3\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.927313 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/500653c7-d0f6-46d5-9411-60a17569fdd3-pod-info\") pod \"500653c7-d0f6-46d5-9411-60a17569fdd3\" 
(UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.927330 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-plugins-conf\") pod \"500653c7-d0f6-46d5-9411-60a17569fdd3\" (UID: \"500653c7-d0f6-46d5-9411-60a17569fdd3\") " Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.928364 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "500653c7-d0f6-46d5-9411-60a17569fdd3" (UID: "500653c7-d0f6-46d5-9411-60a17569fdd3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.929752 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "500653c7-d0f6-46d5-9411-60a17569fdd3" (UID: "500653c7-d0f6-46d5-9411-60a17569fdd3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.930005 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "500653c7-d0f6-46d5-9411-60a17569fdd3" (UID: "500653c7-d0f6-46d5-9411-60a17569fdd3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.937858 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-kube-api-access-l52c8" (OuterVolumeSpecName: "kube-api-access-l52c8") pod "500653c7-d0f6-46d5-9411-60a17569fdd3" (UID: "500653c7-d0f6-46d5-9411-60a17569fdd3"). InnerVolumeSpecName "kube-api-access-l52c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.938118 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/500653c7-d0f6-46d5-9411-60a17569fdd3-pod-info" (OuterVolumeSpecName: "pod-info") pod "500653c7-d0f6-46d5-9411-60a17569fdd3" (UID: "500653c7-d0f6-46d5-9411-60a17569fdd3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.938226 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "500653c7-d0f6-46d5-9411-60a17569fdd3" (UID: "500653c7-d0f6-46d5-9411-60a17569fdd3"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.938592 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "500653c7-d0f6-46d5-9411-60a17569fdd3" (UID: "500653c7-d0f6-46d5-9411-60a17569fdd3"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.959828 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500653c7-d0f6-46d5-9411-60a17569fdd3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "500653c7-d0f6-46d5-9411-60a17569fdd3" (UID: "500653c7-d0f6-46d5-9411-60a17569fdd3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:49:43 crc kubenswrapper[4669]: I1001 11:49:43.970374 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-config-data" (OuterVolumeSpecName: "config-data") pod "500653c7-d0f6-46d5-9411-60a17569fdd3" (UID: "500653c7-d0f6-46d5-9411-60a17569fdd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.004867 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-server-conf" (OuterVolumeSpecName: "server-conf") pod "500653c7-d0f6-46d5-9411-60a17569fdd3" (UID: "500653c7-d0f6-46d5-9411-60a17569fdd3"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.031700 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.031750 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l52c8\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-kube-api-access-l52c8\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.031764 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.031776 4669 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/500653c7-d0f6-46d5-9411-60a17569fdd3-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.031787 4669 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.031795 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.031803 4669 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/500653c7-d0f6-46d5-9411-60a17569fdd3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:44 crc kubenswrapper[4669]: 
I1001 11:49:44.031836 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.031849 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.031860 4669 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/500653c7-d0f6-46d5-9411-60a17569fdd3-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.057471 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.107483 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "500653c7-d0f6-46d5-9411-60a17569fdd3" (UID: "500653c7-d0f6-46d5-9411-60a17569fdd3"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.133582 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/500653c7-d0f6-46d5-9411-60a17569fdd3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.134194 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.170543 4669 generic.go:334] "Generic (PLEG): container finished" podID="500653c7-d0f6-46d5-9411-60a17569fdd3" containerID="4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9" exitCode=0 Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.170604 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"500653c7-d0f6-46d5-9411-60a17569fdd3","Type":"ContainerDied","Data":"4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9"} Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.170620 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.170643 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"500653c7-d0f6-46d5-9411-60a17569fdd3","Type":"ContainerDied","Data":"2a403e600710d456e7a10d42b82b0ac1d27de0eca030353e1597e302cfd29dfd"} Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.170666 4669 scope.go:117] "RemoveContainer" containerID="4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.200884 4669 scope.go:117] "RemoveContainer" containerID="f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.218183 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.229857 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.260262 4669 scope.go:117] "RemoveContainer" containerID="4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.260641 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 11:49:44 crc kubenswrapper[4669]: E1001 11:49:44.261141 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500653c7-d0f6-46d5-9411-60a17569fdd3" containerName="setup-container" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.261163 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="500653c7-d0f6-46d5-9411-60a17569fdd3" containerName="setup-container" Oct 01 11:49:44 crc kubenswrapper[4669]: E1001 11:49:44.261184 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500653c7-d0f6-46d5-9411-60a17569fdd3" containerName="rabbitmq" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 
11:49:44.261192 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="500653c7-d0f6-46d5-9411-60a17569fdd3" containerName="rabbitmq" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.261402 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="500653c7-d0f6-46d5-9411-60a17569fdd3" containerName="rabbitmq" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.262563 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: E1001 11:49:44.266506 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9\": container with ID starting with 4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9 not found: ID does not exist" containerID="4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.266554 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9"} err="failed to get container status \"4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9\": rpc error: code = NotFound desc = could not find container \"4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9\": container with ID starting with 4ea6842b41bed31e4d74bb2ff1dc5f2f9a30c13915b6156ce2f822931bad0fe9 not found: ID does not exist" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.266589 4669 scope.go:117] "RemoveContainer" containerID="f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.266606 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.266893 4669 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.267095 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.267139 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.267263 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vq4fz" Oct 01 11:49:44 crc kubenswrapper[4669]: E1001 11:49:44.267273 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c\": container with ID starting with f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c not found: ID does not exist" containerID="f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.267331 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c"} err="failed to get container status \"f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c\": rpc error: code = NotFound desc = could not find container \"f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c\": container with ID starting with f34755d247e4a2fd16f8e30568675a6987f04938ce27af10ae051eebb2f7fd8c not found: ID does not exist" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.267355 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.267398 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 11:49:44 
crc kubenswrapper[4669]: I1001 11:49:44.294057 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.459059 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.459319 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.459360 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.459467 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/352c2b88-bf96-4858-b166-d5655b36b2b0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.459566 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/352c2b88-bf96-4858-b166-d5655b36b2b0-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.459582 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.459670 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.459695 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/352c2b88-bf96-4858-b166-d5655b36b2b0-config-data\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.459741 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xs72\" (UniqueName: \"kubernetes.io/projected/352c2b88-bf96-4858-b166-d5655b36b2b0-kube-api-access-7xs72\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.459767 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/352c2b88-bf96-4858-b166-d5655b36b2b0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" 
Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.459795 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/352c2b88-bf96-4858-b166-d5655b36b2b0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.562719 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.562789 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.562839 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/352c2b88-bf96-4858-b166-d5655b36b2b0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.562884 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/352c2b88-bf96-4858-b166-d5655b36b2b0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.562906 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.562954 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.562995 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/352c2b88-bf96-4858-b166-d5655b36b2b0-config-data\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.563026 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xs72\" (UniqueName: \"kubernetes.io/projected/352c2b88-bf96-4858-b166-d5655b36b2b0-kube-api-access-7xs72\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.563055 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/352c2b88-bf96-4858-b166-d5655b36b2b0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.563103 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/352c2b88-bf96-4858-b166-d5655b36b2b0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.563182 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.564804 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.565587 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.566821 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/352c2b88-bf96-4858-b166-d5655b36b2b0-config-data\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.566866 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/352c2b88-bf96-4858-b166-d5655b36b2b0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.566880 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.567218 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/352c2b88-bf96-4858-b166-d5655b36b2b0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.568511 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.570434 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/352c2b88-bf96-4858-b166-d5655b36b2b0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.584161 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/352c2b88-bf96-4858-b166-d5655b36b2b0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.587723 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/352c2b88-bf96-4858-b166-d5655b36b2b0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" 
(UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.597747 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xs72\" (UniqueName: \"kubernetes.io/projected/352c2b88-bf96-4858-b166-d5655b36b2b0-kube-api-access-7xs72\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.610749 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"352c2b88-bf96-4858-b166-d5655b36b2b0\") " pod="openstack/rabbitmq-server-0" Oct 01 11:49:44 crc kubenswrapper[4669]: I1001 11:49:44.648491 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.175722 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 11:49:45 crc kubenswrapper[4669]: W1001 11:49:45.201159 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod352c2b88_bf96_4858_b166_d5655b36b2b0.slice/crio-f68839f8c4553dfc67d14f5e74f522aa2b404c36ef5a9b58d145aae8e3fe0a24 WatchSource:0}: Error finding container f68839f8c4553dfc67d14f5e74f522aa2b404c36ef5a9b58d145aae8e3fe0a24: Status 404 returned error can't find the container with id f68839f8c4553dfc67d14f5e74f522aa2b404c36ef5a9b58d145aae8e3fe0a24 Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.217427 4669 generic.go:334] "Generic (PLEG): container finished" podID="4619f705-9393-48c8-bc69-2d6183546af2" containerID="b5c767a1b33375c7e3b5684bfa31b9b72e153b30e7bbe663710482868411b6fa" exitCode=0 Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.217586 4669 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4619f705-9393-48c8-bc69-2d6183546af2","Type":"ContainerDied","Data":"b5c767a1b33375c7e3b5684bfa31b9b72e153b30e7bbe663710482868411b6fa"} Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.588962 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.668280 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="500653c7-d0f6-46d5-9411-60a17569fdd3" path="/var/lib/kubelet/pods/500653c7-d0f6-46d5-9411-60a17569fdd3/volumes" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.788431 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4619f705-9393-48c8-bc69-2d6183546af2-erlang-cookie-secret\") pod \"4619f705-9393-48c8-bc69-2d6183546af2\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.788552 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-server-conf\") pod \"4619f705-9393-48c8-bc69-2d6183546af2\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.788654 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-plugins\") pod \"4619f705-9393-48c8-bc69-2d6183546af2\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.788738 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-plugins-conf\") pod 
\"4619f705-9393-48c8-bc69-2d6183546af2\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.788817 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-config-data\") pod \"4619f705-9393-48c8-bc69-2d6183546af2\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.788877 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq558\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-kube-api-access-gq558\") pod \"4619f705-9393-48c8-bc69-2d6183546af2\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.788927 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-erlang-cookie\") pod \"4619f705-9393-48c8-bc69-2d6183546af2\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.788972 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4619f705-9393-48c8-bc69-2d6183546af2-pod-info\") pod \"4619f705-9393-48c8-bc69-2d6183546af2\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.789003 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-tls\") pod \"4619f705-9393-48c8-bc69-2d6183546af2\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.789028 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"4619f705-9393-48c8-bc69-2d6183546af2\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.789110 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-confd\") pod \"4619f705-9393-48c8-bc69-2d6183546af2\" (UID: \"4619f705-9393-48c8-bc69-2d6183546af2\") " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.789211 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4619f705-9393-48c8-bc69-2d6183546af2" (UID: "4619f705-9393-48c8-bc69-2d6183546af2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.789537 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4619f705-9393-48c8-bc69-2d6183546af2" (UID: "4619f705-9393-48c8-bc69-2d6183546af2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.789679 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4619f705-9393-48c8-bc69-2d6183546af2" (UID: "4619f705-9393-48c8-bc69-2d6183546af2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.790006 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.790022 4669 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.790031 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.795722 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4619f705-9393-48c8-bc69-2d6183546af2" (UID: "4619f705-9393-48c8-bc69-2d6183546af2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.796586 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4619f705-9393-48c8-bc69-2d6183546af2-pod-info" (OuterVolumeSpecName: "pod-info") pod "4619f705-9393-48c8-bc69-2d6183546af2" (UID: "4619f705-9393-48c8-bc69-2d6183546af2"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.796625 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-kube-api-access-gq558" (OuterVolumeSpecName: "kube-api-access-gq558") pod "4619f705-9393-48c8-bc69-2d6183546af2" (UID: "4619f705-9393-48c8-bc69-2d6183546af2"). InnerVolumeSpecName "kube-api-access-gq558". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.801193 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4619f705-9393-48c8-bc69-2d6183546af2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4619f705-9393-48c8-bc69-2d6183546af2" (UID: "4619f705-9393-48c8-bc69-2d6183546af2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.805372 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "4619f705-9393-48c8-bc69-2d6183546af2" (UID: "4619f705-9393-48c8-bc69-2d6183546af2"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.835027 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-config-data" (OuterVolumeSpecName: "config-data") pod "4619f705-9393-48c8-bc69-2d6183546af2" (UID: "4619f705-9393-48c8-bc69-2d6183546af2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.848886 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-server-conf" (OuterVolumeSpecName: "server-conf") pod "4619f705-9393-48c8-bc69-2d6183546af2" (UID: "4619f705-9393-48c8-bc69-2d6183546af2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.894336 4669 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4619f705-9393-48c8-bc69-2d6183546af2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.894389 4669 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.894404 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4619f705-9393-48c8-bc69-2d6183546af2-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.894417 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq558\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-kube-api-access-gq558\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.894433 4669 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4619f705-9393-48c8-bc69-2d6183546af2-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.894446 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.894493 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.915292 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4619f705-9393-48c8-bc69-2d6183546af2" (UID: "4619f705-9393-48c8-bc69-2d6183546af2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.922481 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.997044 4669 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4619f705-9393-48c8-bc69-2d6183546af2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:45 crc kubenswrapper[4669]: I1001 11:49:45.997109 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.239966 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"352c2b88-bf96-4858-b166-d5655b36b2b0","Type":"ContainerStarted","Data":"f68839f8c4553dfc67d14f5e74f522aa2b404c36ef5a9b58d145aae8e3fe0a24"} Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.242870 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4619f705-9393-48c8-bc69-2d6183546af2","Type":"ContainerDied","Data":"036559d558945151b60ce4d18cb5d38688b0ad17d7e90e5898bb61e6b0c23e8c"} Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.242929 4669 scope.go:117] "RemoveContainer" containerID="b5c767a1b33375c7e3b5684bfa31b9b72e153b30e7bbe663710482868411b6fa" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.243156 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.278306 4669 scope.go:117] "RemoveContainer" containerID="809ebe8a7f9b3cd52ba8893dd5e5d7f364e22cda0ade7a5b0d6d5c665aced5b6" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.290932 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.308984 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.327795 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 11:49:46 crc kubenswrapper[4669]: E1001 11:49:46.344614 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4619f705-9393-48c8-bc69-2d6183546af2" containerName="rabbitmq" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.344666 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4619f705-9393-48c8-bc69-2d6183546af2" containerName="rabbitmq" Oct 01 11:49:46 crc kubenswrapper[4669]: E1001 11:49:46.344719 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4619f705-9393-48c8-bc69-2d6183546af2" containerName="setup-container" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.344727 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4619f705-9393-48c8-bc69-2d6183546af2" containerName="setup-container" Oct 01 
11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.346564 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="4619f705-9393-48c8-bc69-2d6183546af2" containerName="rabbitmq" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.348588 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.351334 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.351718 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.355423 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4gvdd" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.355573 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.355734 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.356214 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.356428 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.356667 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.509952 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.510395 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.510452 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.510506 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.510539 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.510617 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.510658 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.510687 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmfcq\" (UniqueName: \"kubernetes.io/projected/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-kube-api-access-jmfcq\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.510820 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.510859 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.510885 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.612666 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.612725 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.612763 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.612819 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.612851 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.612891 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.612968 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.613027 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmfcq\" (UniqueName: \"kubernetes.io/projected/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-kube-api-access-jmfcq\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.613127 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.613146 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 
01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.613180 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.617155 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.618017 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.618364 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.619923 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.620372 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.627423 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.627565 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.628597 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.629234 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.630989 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.642492 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmfcq\" (UniqueName: \"kubernetes.io/projected/f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e-kube-api-access-jmfcq\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.678755 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.714965 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.881988 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-sqvrv"] Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.888453 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.895436 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 01 11:49:46 crc kubenswrapper[4669]: I1001 11:49:46.897368 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-sqvrv"] Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.027493 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-config\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.027561 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.027601 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.027640 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " 
pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.027689 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.027722 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5mzz\" (UniqueName: \"kubernetes.io/projected/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-kube-api-access-t5mzz\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.027757 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-svc\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.130215 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-config\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.130309 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " 
pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.130374 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.130499 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.130575 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.130617 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5mzz\" (UniqueName: \"kubernetes.io/projected/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-kube-api-access-t5mzz\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.130660 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-svc\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " 
pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.132227 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.132277 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.132673 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-svc\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.133111 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.133144 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-config\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 
11:49:47.133623 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.152958 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5mzz\" (UniqueName: \"kubernetes.io/projected/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-kube-api-access-t5mzz\") pod \"dnsmasq-dns-67b789f86c-sqvrv\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.227235 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.281264 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"352c2b88-bf96-4858-b166-d5655b36b2b0","Type":"ContainerStarted","Data":"d1c9c0435b596dfc4346b2c000ee5beea06a7768fc159ad21915ed3b5a9af5d6"} Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.401387 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.656809 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4619f705-9393-48c8-bc69-2d6183546af2" path="/var/lib/kubelet/pods/4619f705-9393-48c8-bc69-2d6183546af2/volumes" Oct 01 11:49:47 crc kubenswrapper[4669]: I1001 11:49:47.809933 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-sqvrv"] Oct 01 11:49:48 crc kubenswrapper[4669]: I1001 11:49:48.301384 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e","Type":"ContainerStarted","Data":"7005d1719dbf61efe109cf4bf867e7b924fb41e10af2588cd59e928f435def1d"} Oct 01 11:49:48 crc kubenswrapper[4669]: I1001 11:49:48.303816 4669 generic.go:334] "Generic (PLEG): container finished" podID="fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" containerID="024a3cef79654e23365f4ad66fd14a9245acb3ad993061f25bfed0fa3efb20c4" exitCode=0 Oct 01 11:49:48 crc kubenswrapper[4669]: I1001 11:49:48.305216 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" event={"ID":"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe","Type":"ContainerDied","Data":"024a3cef79654e23365f4ad66fd14a9245acb3ad993061f25bfed0fa3efb20c4"} Oct 01 11:49:48 crc kubenswrapper[4669]: I1001 11:49:48.305255 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" event={"ID":"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe","Type":"ContainerStarted","Data":"169e17f70f4675aadd3a0c516b64479ad65a38b01170e631f1c54668a3c1384d"} Oct 01 11:49:49 crc kubenswrapper[4669]: I1001 11:49:49.316384 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" event={"ID":"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe","Type":"ContainerStarted","Data":"94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e"} Oct 01 11:49:49 crc kubenswrapper[4669]: I1001 11:49:49.316696 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:49 crc kubenswrapper[4669]: I1001 11:49:49.337604 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" podStartSLOduration=3.337574732 podStartE2EDuration="3.337574732s" podCreationTimestamp="2025-10-01 11:49:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:49:49.335292227 +0000 UTC 
m=+1280.434857254" watchObservedRunningTime="2025-10-01 11:49:49.337574732 +0000 UTC m=+1280.437139749" Oct 01 11:49:50 crc kubenswrapper[4669]: I1001 11:49:50.332285 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e","Type":"ContainerStarted","Data":"318f9d3abcf804ec1daae34482ad05ef37ef7649253156f785af1d41dfe52e9d"} Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.230415 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.331574 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gltl2"] Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.331903 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" podUID="ddc3cccf-6f89-44ac-a85c-8ab53a95493b" containerName="dnsmasq-dns" containerID="cri-o://cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc" gracePeriod=10 Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.556436 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-bkd88"] Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.569514 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.590027 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-bkd88"] Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.592685 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.592781 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.592815 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpzf2\" (UniqueName: \"kubernetes.io/projected/9d9999e8-41a9-4930-b113-7f135640c123-kube-api-access-xpzf2\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.592915 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-config\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.592948 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.592976 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.593015 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.695353 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.695827 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.695944 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xpzf2\" (UniqueName: \"kubernetes.io/projected/9d9999e8-41a9-4930-b113-7f135640c123-kube-api-access-xpzf2\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.696183 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-config\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.696293 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.696389 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.696501 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.698939 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.700249 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.701121 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.701132 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.703360 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.703472 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d9999e8-41a9-4930-b113-7f135640c123-config\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" 
(UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.721748 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpzf2\" (UniqueName: \"kubernetes.io/projected/9d9999e8-41a9-4930-b113-7f135640c123-kube-api-access-xpzf2\") pod \"dnsmasq-dns-cb6ffcf87-bkd88\" (UID: \"9d9999e8-41a9-4930-b113-7f135640c123\") " pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.895649 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" Oct 01 11:49:57 crc kubenswrapper[4669]: I1001 11:49:57.909179 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.004180 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-config\") pod \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.004234 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6gcm\" (UniqueName: \"kubernetes.io/projected/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-kube-api-access-z6gcm\") pod \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.004313 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-svc\") pod \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.004557 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-swift-storage-0\") pod \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.004607 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-nb\") pod \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.004688 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-sb\") pod \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\" (UID: \"ddc3cccf-6f89-44ac-a85c-8ab53a95493b\") " Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.010372 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-kube-api-access-z6gcm" (OuterVolumeSpecName: "kube-api-access-z6gcm") pod "ddc3cccf-6f89-44ac-a85c-8ab53a95493b" (UID: "ddc3cccf-6f89-44ac-a85c-8ab53a95493b"). InnerVolumeSpecName "kube-api-access-z6gcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.070369 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ddc3cccf-6f89-44ac-a85c-8ab53a95493b" (UID: "ddc3cccf-6f89-44ac-a85c-8ab53a95493b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.070416 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ddc3cccf-6f89-44ac-a85c-8ab53a95493b" (UID: "ddc3cccf-6f89-44ac-a85c-8ab53a95493b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.083251 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-config" (OuterVolumeSpecName: "config") pod "ddc3cccf-6f89-44ac-a85c-8ab53a95493b" (UID: "ddc3cccf-6f89-44ac-a85c-8ab53a95493b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.098209 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ddc3cccf-6f89-44ac-a85c-8ab53a95493b" (UID: "ddc3cccf-6f89-44ac-a85c-8ab53a95493b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.098319 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ddc3cccf-6f89-44ac-a85c-8ab53a95493b" (UID: "ddc3cccf-6f89-44ac-a85c-8ab53a95493b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.107785 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.107820 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.107832 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6gcm\" (UniqueName: \"kubernetes.io/projected/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-kube-api-access-z6gcm\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.107846 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.107854 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.107862 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddc3cccf-6f89-44ac-a85c-8ab53a95493b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.388577 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-bkd88"] Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.442013 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" 
event={"ID":"9d9999e8-41a9-4930-b113-7f135640c123","Type":"ContainerStarted","Data":"fbfb559e1201ecdd299a3b7acd9eda28f93dd715c1500e1c9011db6a6e431331"} Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.444446 4669 generic.go:334] "Generic (PLEG): container finished" podID="ddc3cccf-6f89-44ac-a85c-8ab53a95493b" containerID="cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc" exitCode=0 Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.444503 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" event={"ID":"ddc3cccf-6f89-44ac-a85c-8ab53a95493b","Type":"ContainerDied","Data":"cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc"} Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.444523 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" event={"ID":"ddc3cccf-6f89-44ac-a85c-8ab53a95493b","Type":"ContainerDied","Data":"1c513d5fc1a6693d07435aa9e85e5082e47dc9c732069cb5f6a8048456306fb3"} Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.444542 4669 scope.go:117] "RemoveContainer" containerID="cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.444719 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gltl2" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.593268 4669 scope.go:117] "RemoveContainer" containerID="5e307accf29ddb5bb2a4f31d428ee861817ee0c31d692523fc82eff7923bf296" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.627439 4669 scope.go:117] "RemoveContainer" containerID="cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc" Oct 01 11:49:58 crc kubenswrapper[4669]: E1001 11:49:58.628260 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc\": container with ID starting with cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc not found: ID does not exist" containerID="cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.628336 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc"} err="failed to get container status \"cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc\": rpc error: code = NotFound desc = could not find container \"cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc\": container with ID starting with cc0de991031c2f18eb6495f92afe0bfecfebcae706c0c332f5f0f290fd07f0bc not found: ID does not exist" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.628367 4669 scope.go:117] "RemoveContainer" containerID="5e307accf29ddb5bb2a4f31d428ee861817ee0c31d692523fc82eff7923bf296" Oct 01 11:49:58 crc kubenswrapper[4669]: E1001 11:49:58.628879 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e307accf29ddb5bb2a4f31d428ee861817ee0c31d692523fc82eff7923bf296\": container with ID starting with 
5e307accf29ddb5bb2a4f31d428ee861817ee0c31d692523fc82eff7923bf296 not found: ID does not exist" containerID="5e307accf29ddb5bb2a4f31d428ee861817ee0c31d692523fc82eff7923bf296" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.628920 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e307accf29ddb5bb2a4f31d428ee861817ee0c31d692523fc82eff7923bf296"} err="failed to get container status \"5e307accf29ddb5bb2a4f31d428ee861817ee0c31d692523fc82eff7923bf296\": rpc error: code = NotFound desc = could not find container \"5e307accf29ddb5bb2a4f31d428ee861817ee0c31d692523fc82eff7923bf296\": container with ID starting with 5e307accf29ddb5bb2a4f31d428ee861817ee0c31d692523fc82eff7923bf296 not found: ID does not exist" Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.639678 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gltl2"] Oct 01 11:49:58 crc kubenswrapper[4669]: I1001 11:49:58.653588 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gltl2"] Oct 01 11:49:59 crc kubenswrapper[4669]: I1001 11:49:59.461672 4669 generic.go:334] "Generic (PLEG): container finished" podID="9d9999e8-41a9-4930-b113-7f135640c123" containerID="cd801d455a3c3417f74fcc93548df58f7c034b847274706f3f675288f564152c" exitCode=0 Oct 01 11:49:59 crc kubenswrapper[4669]: I1001 11:49:59.462144 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" event={"ID":"9d9999e8-41a9-4930-b113-7f135640c123","Type":"ContainerDied","Data":"cd801d455a3c3417f74fcc93548df58f7c034b847274706f3f675288f564152c"} Oct 01 11:49:59 crc kubenswrapper[4669]: I1001 11:49:59.670718 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddc3cccf-6f89-44ac-a85c-8ab53a95493b" path="/var/lib/kubelet/pods/ddc3cccf-6f89-44ac-a85c-8ab53a95493b/volumes" Oct 01 11:50:00 crc kubenswrapper[4669]: I1001 11:50:00.489483 4669 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" event={"ID":"9d9999e8-41a9-4930-b113-7f135640c123","Type":"ContainerStarted","Data":"e0479ac0d473f991fef026fe545336fffd2df5b41b465ac68ea5b386d1ee199e"} Oct 01 11:50:00 crc kubenswrapper[4669]: I1001 11:50:00.489737 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:50:00 crc kubenswrapper[4669]: I1001 11:50:00.517875 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" podStartSLOduration=3.517844678 podStartE2EDuration="3.517844678s" podCreationTimestamp="2025-10-01 11:49:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:50:00.510948959 +0000 UTC m=+1291.610513946" watchObservedRunningTime="2025-10-01 11:50:00.517844678 +0000 UTC m=+1291.617409665" Oct 01 11:50:07 crc kubenswrapper[4669]: I1001 11:50:07.911625 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-bkd88" Oct 01 11:50:07 crc kubenswrapper[4669]: I1001 11:50:07.992216 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-sqvrv"] Oct 01 11:50:07 crc kubenswrapper[4669]: I1001 11:50:07.992633 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" podUID="fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" containerName="dnsmasq-dns" containerID="cri-o://94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e" gracePeriod=10 Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.565023 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.589643 4669 generic.go:334] "Generic (PLEG): container finished" podID="fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" containerID="94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e" exitCode=0 Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.589725 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" event={"ID":"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe","Type":"ContainerDied","Data":"94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e"} Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.589807 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.589837 4669 scope.go:117] "RemoveContainer" containerID="94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.589820 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-sqvrv" event={"ID":"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe","Type":"ContainerDied","Data":"169e17f70f4675aadd3a0c516b64479ad65a38b01170e631f1c54668a3c1384d"} Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.615527 4669 scope.go:117] "RemoveContainer" containerID="024a3cef79654e23365f4ad66fd14a9245acb3ad993061f25bfed0fa3efb20c4" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.647833 4669 scope.go:117] "RemoveContainer" containerID="94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e" Oct 01 11:50:08 crc kubenswrapper[4669]: E1001 11:50:08.648511 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e\": container with ID starting with 
94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e not found: ID does not exist" containerID="94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.648571 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e"} err="failed to get container status \"94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e\": rpc error: code = NotFound desc = could not find container \"94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e\": container with ID starting with 94d324bd2941dd4d6d10f2d090025a08be39c80515b285fd2123d355516d323e not found: ID does not exist" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.648610 4669 scope.go:117] "RemoveContainer" containerID="024a3cef79654e23365f4ad66fd14a9245acb3ad993061f25bfed0fa3efb20c4" Oct 01 11:50:08 crc kubenswrapper[4669]: E1001 11:50:08.649065 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024a3cef79654e23365f4ad66fd14a9245acb3ad993061f25bfed0fa3efb20c4\": container with ID starting with 024a3cef79654e23365f4ad66fd14a9245acb3ad993061f25bfed0fa3efb20c4 not found: ID does not exist" containerID="024a3cef79654e23365f4ad66fd14a9245acb3ad993061f25bfed0fa3efb20c4" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.649141 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024a3cef79654e23365f4ad66fd14a9245acb3ad993061f25bfed0fa3efb20c4"} err="failed to get container status \"024a3cef79654e23365f4ad66fd14a9245acb3ad993061f25bfed0fa3efb20c4\": rpc error: code = NotFound desc = could not find container \"024a3cef79654e23365f4ad66fd14a9245acb3ad993061f25bfed0fa3efb20c4\": container with ID starting with 024a3cef79654e23365f4ad66fd14a9245acb3ad993061f25bfed0fa3efb20c4 not found: ID does not 
exist" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.714511 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5mzz\" (UniqueName: \"kubernetes.io/projected/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-kube-api-access-t5mzz\") pod \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.714671 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-swift-storage-0\") pod \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.714714 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-sb\") pod \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.714776 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-openstack-edpm-ipam\") pod \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.714821 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-nb\") pod \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.714845 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-config\") pod \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.714939 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-svc\") pod \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\" (UID: \"fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe\") " Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.727394 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-kube-api-access-t5mzz" (OuterVolumeSpecName: "kube-api-access-t5mzz") pod "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" (UID: "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe"). InnerVolumeSpecName "kube-api-access-t5mzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.780636 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" (UID: "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.789766 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" (UID: "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.803800 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" (UID: "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.805958 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" (UID: "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.806276 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-config" (OuterVolumeSpecName: "config") pod "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" (UID: "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.810032 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" (UID: "fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.817586 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.818106 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.818149 4669 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.818166 4669 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.818183 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5mzz\" (UniqueName: \"kubernetes.io/projected/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-kube-api-access-t5mzz\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.818200 4669 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.818217 4669 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.948250 
4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-sqvrv"] Oct 01 11:50:08 crc kubenswrapper[4669]: I1001 11:50:08.957171 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-sqvrv"] Oct 01 11:50:09 crc kubenswrapper[4669]: I1001 11:50:09.660005 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" path="/var/lib/kubelet/pods/fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe/volumes" Oct 01 11:50:19 crc kubenswrapper[4669]: I1001 11:50:19.714791 4669 generic.go:334] "Generic (PLEG): container finished" podID="352c2b88-bf96-4858-b166-d5655b36b2b0" containerID="d1c9c0435b596dfc4346b2c000ee5beea06a7768fc159ad21915ed3b5a9af5d6" exitCode=0 Oct 01 11:50:19 crc kubenswrapper[4669]: I1001 11:50:19.714906 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"352c2b88-bf96-4858-b166-d5655b36b2b0","Type":"ContainerDied","Data":"d1c9c0435b596dfc4346b2c000ee5beea06a7768fc159ad21915ed3b5a9af5d6"} Oct 01 11:50:20 crc kubenswrapper[4669]: I1001 11:50:20.737359 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"352c2b88-bf96-4858-b166-d5655b36b2b0","Type":"ContainerStarted","Data":"87aca2c20e08a723986c1b77bbc4ca9954884830258b7dde80975ab1a8e16fda"} Oct 01 11:50:20 crc kubenswrapper[4669]: I1001 11:50:20.738299 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 11:50:20 crc kubenswrapper[4669]: I1001 11:50:20.780821 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.780783675 podStartE2EDuration="36.780783675s" podCreationTimestamp="2025-10-01 11:49:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:50:20.765969661 +0000 UTC 
m=+1311.865534668" watchObservedRunningTime="2025-10-01 11:50:20.780783675 +0000 UTC m=+1311.880348682" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.382808 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8"] Oct 01 11:50:21 crc kubenswrapper[4669]: E1001 11:50:21.383418 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" containerName="init" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.383443 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" containerName="init" Oct 01 11:50:21 crc kubenswrapper[4669]: E1001 11:50:21.383500 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc3cccf-6f89-44ac-a85c-8ab53a95493b" containerName="dnsmasq-dns" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.383510 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc3cccf-6f89-44ac-a85c-8ab53a95493b" containerName="dnsmasq-dns" Oct 01 11:50:21 crc kubenswrapper[4669]: E1001 11:50:21.383525 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" containerName="dnsmasq-dns" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.383533 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" containerName="dnsmasq-dns" Oct 01 11:50:21 crc kubenswrapper[4669]: E1001 11:50:21.383543 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc3cccf-6f89-44ac-a85c-8ab53a95493b" containerName="init" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.383551 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc3cccf-6f89-44ac-a85c-8ab53a95493b" containerName="init" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.383794 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddc3cccf-6f89-44ac-a85c-8ab53a95493b" 
containerName="dnsmasq-dns" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.383813 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa94d9bc-ee63-4b99-8e81-3e9a4c31a7fe" containerName="dnsmasq-dns" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.385602 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.390846 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.391248 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.391729 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.392776 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.415520 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8"] Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.524336 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.524434 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.524582 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.524626 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8hc\" (UniqueName: \"kubernetes.io/projected/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-kube-api-access-px8hc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.626834 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.626912 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px8hc\" (UniqueName: \"kubernetes.io/projected/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-kube-api-access-px8hc\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.627010 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.627046 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.636311 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.647057 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.648804 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.651742 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px8hc\" (UniqueName: \"kubernetes.io/projected/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-kube-api-access-px8hc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:21 crc kubenswrapper[4669]: I1001 11:50:21.732533 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:22 crc kubenswrapper[4669]: I1001 11:50:22.358620 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8"] Oct 01 11:50:22 crc kubenswrapper[4669]: W1001 11:50:22.363091 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f131ccb_5e9b_4097_8abe_f10d6f2c9b52.slice/crio-3fb9b818db009d95b5cd2773ec397702b1c12240c80364c1746fcd4d4aa2c3b5 WatchSource:0}: Error finding container 3fb9b818db009d95b5cd2773ec397702b1c12240c80364c1746fcd4d4aa2c3b5: Status 404 returned error can't find the container with id 3fb9b818db009d95b5cd2773ec397702b1c12240c80364c1746fcd4d4aa2c3b5 Oct 01 11:50:22 crc kubenswrapper[4669]: I1001 11:50:22.367638 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 11:50:22 crc kubenswrapper[4669]: I1001 11:50:22.772882 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" event={"ID":"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52","Type":"ContainerStarted","Data":"3fb9b818db009d95b5cd2773ec397702b1c12240c80364c1746fcd4d4aa2c3b5"} Oct 01 11:50:22 crc kubenswrapper[4669]: I1001 11:50:22.776460 4669 generic.go:334] "Generic (PLEG): container finished" podID="f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e" containerID="318f9d3abcf804ec1daae34482ad05ef37ef7649253156f785af1d41dfe52e9d" exitCode=0 Oct 01 11:50:22 crc kubenswrapper[4669]: I1001 11:50:22.776512 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e","Type":"ContainerDied","Data":"318f9d3abcf804ec1daae34482ad05ef37ef7649253156f785af1d41dfe52e9d"} Oct 01 11:50:23 crc kubenswrapper[4669]: I1001 11:50:23.798970 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e","Type":"ContainerStarted","Data":"9f15275a92a74ddea10c05024bb1c059a377e8009d24d1b499ca24ce0f96cff0"} Oct 01 11:50:23 crc kubenswrapper[4669]: I1001 11:50:23.801183 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:50:23 crc kubenswrapper[4669]: I1001 11:50:23.831353 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.831327686 podStartE2EDuration="37.831327686s" podCreationTimestamp="2025-10-01 11:49:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:50:23.821669358 +0000 UTC m=+1314.921234355" watchObservedRunningTime="2025-10-01 11:50:23.831327686 +0000 UTC m=+1314.930892663" Oct 01 11:50:31 crc kubenswrapper[4669]: I1001 11:50:31.719587 4669 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"openstack-aee-default-env" Oct 01 11:50:32 crc kubenswrapper[4669]: I1001 11:50:32.920460 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" event={"ID":"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52","Type":"ContainerStarted","Data":"06681080b72a54e278b433b5b433b2acbb5784c962cd7c5a779ba2ff8487d05f"} Oct 01 11:50:32 crc kubenswrapper[4669]: I1001 11:50:32.953797 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" podStartSLOduration=2.607715934 podStartE2EDuration="11.953775245s" podCreationTimestamp="2025-10-01 11:50:21 +0000 UTC" firstStartedPulling="2025-10-01 11:50:22.36708692 +0000 UTC m=+1313.466651897" lastFinishedPulling="2025-10-01 11:50:31.713146221 +0000 UTC m=+1322.812711208" observedRunningTime="2025-10-01 11:50:32.951676103 +0000 UTC m=+1324.051241100" watchObservedRunningTime="2025-10-01 11:50:32.953775245 +0000 UTC m=+1324.053340222" Oct 01 11:50:34 crc kubenswrapper[4669]: I1001 11:50:34.654562 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 11:50:36 crc kubenswrapper[4669]: I1001 11:50:36.719393 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 11:50:44 crc kubenswrapper[4669]: I1001 11:50:44.091140 4669 generic.go:334] "Generic (PLEG): container finished" podID="3f131ccb-5e9b-4097-8abe-f10d6f2c9b52" containerID="06681080b72a54e278b433b5b433b2acbb5784c962cd7c5a779ba2ff8487d05f" exitCode=0 Oct 01 11:50:44 crc kubenswrapper[4669]: I1001 11:50:44.092384 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" event={"ID":"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52","Type":"ContainerDied","Data":"06681080b72a54e278b433b5b433b2acbb5784c962cd7c5a779ba2ff8487d05f"} Oct 01 11:50:45 crc 
kubenswrapper[4669]: I1001 11:50:45.600102 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:45 crc kubenswrapper[4669]: I1001 11:50:45.673266 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-ssh-key\") pod \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " Oct 01 11:50:45 crc kubenswrapper[4669]: I1001 11:50:45.673409 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px8hc\" (UniqueName: \"kubernetes.io/projected/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-kube-api-access-px8hc\") pod \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " Oct 01 11:50:45 crc kubenswrapper[4669]: I1001 11:50:45.673471 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-inventory\") pod \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " Oct 01 11:50:45 crc kubenswrapper[4669]: I1001 11:50:45.673709 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-repo-setup-combined-ca-bundle\") pod \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " Oct 01 11:50:45 crc kubenswrapper[4669]: I1001 11:50:45.681690 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3f131ccb-5e9b-4097-8abe-f10d6f2c9b52" (UID: "3f131ccb-5e9b-4097-8abe-f10d6f2c9b52"). 
InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:50:45 crc kubenswrapper[4669]: I1001 11:50:45.683912 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-kube-api-access-px8hc" (OuterVolumeSpecName: "kube-api-access-px8hc") pod "3f131ccb-5e9b-4097-8abe-f10d6f2c9b52" (UID: "3f131ccb-5e9b-4097-8abe-f10d6f2c9b52"). InnerVolumeSpecName "kube-api-access-px8hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:50:45 crc kubenswrapper[4669]: E1001 11:50:45.710541 4669 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-inventory podName:3f131ccb-5e9b-4097-8abe-f10d6f2c9b52 nodeName:}" failed. No retries permitted until 2025-10-01 11:50:46.210495753 +0000 UTC m=+1337.310060730 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-inventory") pod "3f131ccb-5e9b-4097-8abe-f10d6f2c9b52" (UID: "3f131ccb-5e9b-4097-8abe-f10d6f2c9b52") : error deleting /var/lib/kubelet/pods/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52/volume-subpaths: remove /var/lib/kubelet/pods/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52/volume-subpaths: no such file or directory Oct 01 11:50:45 crc kubenswrapper[4669]: I1001 11:50:45.716193 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3f131ccb-5e9b-4097-8abe-f10d6f2c9b52" (UID: "3f131ccb-5e9b-4097-8abe-f10d6f2c9b52"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:50:45 crc kubenswrapper[4669]: I1001 11:50:45.777109 4669 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:45 crc kubenswrapper[4669]: I1001 11:50:45.777163 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:45 crc kubenswrapper[4669]: I1001 11:50:45.777181 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px8hc\" (UniqueName: \"kubernetes.io/projected/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-kube-api-access-px8hc\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.118671 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" event={"ID":"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52","Type":"ContainerDied","Data":"3fb9b818db009d95b5cd2773ec397702b1c12240c80364c1746fcd4d4aa2c3b5"} Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.118747 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fb9b818db009d95b5cd2773ec397702b1c12240c80364c1746fcd4d4aa2c3b5" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.118793 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.229373 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2"] Oct 01 11:50:46 crc kubenswrapper[4669]: E1001 11:50:46.230261 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f131ccb-5e9b-4097-8abe-f10d6f2c9b52" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.230285 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f131ccb-5e9b-4097-8abe-f10d6f2c9b52" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.230516 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f131ccb-5e9b-4097-8abe-f10d6f2c9b52" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.231310 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.258859 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2"] Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.291145 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-inventory\") pod \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\" (UID: \"3f131ccb-5e9b-4097-8abe-f10d6f2c9b52\") " Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.295499 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-inventory" (OuterVolumeSpecName: "inventory") pod "3f131ccb-5e9b-4097-8abe-f10d6f2c9b52" (UID: "3f131ccb-5e9b-4097-8abe-f10d6f2c9b52"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.393603 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-k47v2\" (UID: \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.393834 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-k47v2\" (UID: \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.393963 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4brjq\" (UniqueName: \"kubernetes.io/projected/a422f4f8-7b2e-4f73-89e8-2659cda6effa-kube-api-access-4brjq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-k47v2\" (UID: \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.394094 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f131ccb-5e9b-4097-8abe-f10d6f2c9b52-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.496099 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-k47v2\" (UID: \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.496232 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-k47v2\" (UID: \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.496277 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4brjq\" (UniqueName: \"kubernetes.io/projected/a422f4f8-7b2e-4f73-89e8-2659cda6effa-kube-api-access-4brjq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-k47v2\" (UID: \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.504534 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-k47v2\" (UID: \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.508897 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-k47v2\" (UID: \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.520735 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4brjq\" (UniqueName: \"kubernetes.io/projected/a422f4f8-7b2e-4f73-89e8-2659cda6effa-kube-api-access-4brjq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-k47v2\" (UID: \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:46 crc kubenswrapper[4669]: I1001 11:50:46.561330 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:47 crc kubenswrapper[4669]: I1001 11:50:47.216970 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2"] Oct 01 11:50:47 crc kubenswrapper[4669]: W1001 11:50:47.230725 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda422f4f8_7b2e_4f73_89e8_2659cda6effa.slice/crio-36f1f81dec05ac79a4d4d20d7d3770f1cf652aba48ae68060956f96841d6262f WatchSource:0}: Error finding container 36f1f81dec05ac79a4d4d20d7d3770f1cf652aba48ae68060956f96841d6262f: Status 404 returned error can't find the container with id 36f1f81dec05ac79a4d4d20d7d3770f1cf652aba48ae68060956f96841d6262f Oct 01 11:50:48 crc kubenswrapper[4669]: I1001 11:50:48.157124 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" event={"ID":"a422f4f8-7b2e-4f73-89e8-2659cda6effa","Type":"ContainerStarted","Data":"5abf2555e00171244a980390cfacfd565b609c56d89c8ca02e7041c4d9060fe8"} Oct 01 11:50:48 crc kubenswrapper[4669]: I1001 11:50:48.157782 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" event={"ID":"a422f4f8-7b2e-4f73-89e8-2659cda6effa","Type":"ContainerStarted","Data":"36f1f81dec05ac79a4d4d20d7d3770f1cf652aba48ae68060956f96841d6262f"} Oct 01 11:50:48 crc kubenswrapper[4669]: I1001 11:50:48.178026 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" podStartSLOduration=1.7485166429999999 podStartE2EDuration="2.178004473s" podCreationTimestamp="2025-10-01 11:50:46 +0000 UTC" firstStartedPulling="2025-10-01 11:50:47.234680512 +0000 UTC m=+1338.334245489" lastFinishedPulling="2025-10-01 11:50:47.664168342 +0000 UTC m=+1338.763733319" 
observedRunningTime="2025-10-01 11:50:48.176180919 +0000 UTC m=+1339.275745946" watchObservedRunningTime="2025-10-01 11:50:48.178004473 +0000 UTC m=+1339.277569450" Oct 01 11:50:51 crc kubenswrapper[4669]: I1001 11:50:51.198175 4669 generic.go:334] "Generic (PLEG): container finished" podID="a422f4f8-7b2e-4f73-89e8-2659cda6effa" containerID="5abf2555e00171244a980390cfacfd565b609c56d89c8ca02e7041c4d9060fe8" exitCode=0 Oct 01 11:50:51 crc kubenswrapper[4669]: I1001 11:50:51.198316 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" event={"ID":"a422f4f8-7b2e-4f73-89e8-2659cda6effa","Type":"ContainerDied","Data":"5abf2555e00171244a980390cfacfd565b609c56d89c8ca02e7041c4d9060fe8"} Oct 01 11:50:52 crc kubenswrapper[4669]: I1001 11:50:52.752765 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:52 crc kubenswrapper[4669]: I1001 11:50:52.879396 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-inventory\") pod \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\" (UID: \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " Oct 01 11:50:52 crc kubenswrapper[4669]: I1001 11:50:52.879864 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-ssh-key\") pod \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\" (UID: \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " Oct 01 11:50:52 crc kubenswrapper[4669]: I1001 11:50:52.879906 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4brjq\" (UniqueName: \"kubernetes.io/projected/a422f4f8-7b2e-4f73-89e8-2659cda6effa-kube-api-access-4brjq\") pod \"a422f4f8-7b2e-4f73-89e8-2659cda6effa\" (UID: 
\"a422f4f8-7b2e-4f73-89e8-2659cda6effa\") " Oct 01 11:50:52 crc kubenswrapper[4669]: I1001 11:50:52.888400 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a422f4f8-7b2e-4f73-89e8-2659cda6effa-kube-api-access-4brjq" (OuterVolumeSpecName: "kube-api-access-4brjq") pod "a422f4f8-7b2e-4f73-89e8-2659cda6effa" (UID: "a422f4f8-7b2e-4f73-89e8-2659cda6effa"). InnerVolumeSpecName "kube-api-access-4brjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:50:52 crc kubenswrapper[4669]: I1001 11:50:52.921359 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-inventory" (OuterVolumeSpecName: "inventory") pod "a422f4f8-7b2e-4f73-89e8-2659cda6effa" (UID: "a422f4f8-7b2e-4f73-89e8-2659cda6effa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:50:52 crc kubenswrapper[4669]: I1001 11:50:52.921372 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a422f4f8-7b2e-4f73-89e8-2659cda6effa" (UID: "a422f4f8-7b2e-4f73-89e8-2659cda6effa"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:50:52 crc kubenswrapper[4669]: I1001 11:50:52.982810 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:52 crc kubenswrapper[4669]: I1001 11:50:52.982851 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4brjq\" (UniqueName: \"kubernetes.io/projected/a422f4f8-7b2e-4f73-89e8-2659cda6effa-kube-api-access-4brjq\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:52 crc kubenswrapper[4669]: I1001 11:50:52.982865 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a422f4f8-7b2e-4f73-89e8-2659cda6effa-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.229900 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" event={"ID":"a422f4f8-7b2e-4f73-89e8-2659cda6effa","Type":"ContainerDied","Data":"36f1f81dec05ac79a4d4d20d7d3770f1cf652aba48ae68060956f96841d6262f"} Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.229973 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36f1f81dec05ac79a4d4d20d7d3770f1cf652aba48ae68060956f96841d6262f" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.230052 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-k47v2" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.345735 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6"] Oct 01 11:50:53 crc kubenswrapper[4669]: E1001 11:50:53.347475 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a422f4f8-7b2e-4f73-89e8-2659cda6effa" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.347578 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a422f4f8-7b2e-4f73-89e8-2659cda6effa" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.347899 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a422f4f8-7b2e-4f73-89e8-2659cda6effa" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.349114 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.351662 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.352261 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.355677 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6"] Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.356138 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.356644 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.506631 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.506743 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.506795 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.506818 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgmn9\" (UniqueName: \"kubernetes.io/projected/b905607b-b7ef-420f-8c4e-603d4c788186-kube-api-access-rgmn9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.609901 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.610832 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.611135 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.611240 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgmn9\" (UniqueName: \"kubernetes.io/projected/b905607b-b7ef-420f-8c4e-603d4c788186-kube-api-access-rgmn9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.619769 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.621380 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.622662 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.635050 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rgmn9\" (UniqueName: \"kubernetes.io/projected/b905607b-b7ef-420f-8c4e-603d4c788186-kube-api-access-rgmn9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:53 crc kubenswrapper[4669]: I1001 11:50:53.679860 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:50:54 crc kubenswrapper[4669]: W1001 11:50:54.294330 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb905607b_b7ef_420f_8c4e_603d4c788186.slice/crio-665d8419ecc8c0ca675890a0c2b05d2e640341c0fa9ec40b9985d7369105e018 WatchSource:0}: Error finding container 665d8419ecc8c0ca675890a0c2b05d2e640341c0fa9ec40b9985d7369105e018: Status 404 returned error can't find the container with id 665d8419ecc8c0ca675890a0c2b05d2e640341c0fa9ec40b9985d7369105e018 Oct 01 11:50:54 crc kubenswrapper[4669]: I1001 11:50:54.301188 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6"] Oct 01 11:50:55 crc kubenswrapper[4669]: I1001 11:50:55.260556 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" event={"ID":"b905607b-b7ef-420f-8c4e-603d4c788186","Type":"ContainerStarted","Data":"259a46ca9656b4105ec7b7f4186ea890927e5e3b9edf12a9a8dd5a1718c2daa7"} Oct 01 11:50:55 crc kubenswrapper[4669]: I1001 11:50:55.261249 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" event={"ID":"b905607b-b7ef-420f-8c4e-603d4c788186","Type":"ContainerStarted","Data":"665d8419ecc8c0ca675890a0c2b05d2e640341c0fa9ec40b9985d7369105e018"} Oct 01 11:50:55 crc kubenswrapper[4669]: 
I1001 11:50:55.298320 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" podStartSLOduration=1.875131557 podStartE2EDuration="2.298279591s" podCreationTimestamp="2025-10-01 11:50:53 +0000 UTC" firstStartedPulling="2025-10-01 11:50:54.297833067 +0000 UTC m=+1345.397398034" lastFinishedPulling="2025-10-01 11:50:54.720981061 +0000 UTC m=+1345.820546068" observedRunningTime="2025-10-01 11:50:55.286731397 +0000 UTC m=+1346.386296424" watchObservedRunningTime="2025-10-01 11:50:55.298279591 +0000 UTC m=+1346.397844608" Oct 01 11:51:32 crc kubenswrapper[4669]: I1001 11:51:32.820817 4669 scope.go:117] "RemoveContainer" containerID="a6ba5796a6d3416538a0af7dc289a2bc98f52a02d5a2bc45099a7dc04de54970" Oct 01 11:51:32 crc kubenswrapper[4669]: I1001 11:51:32.853391 4669 scope.go:117] "RemoveContainer" containerID="22fb6b1698440678931072c89b6015d5277af68319ec2fcee3ebf277a93d1272" Oct 01 11:51:32 crc kubenswrapper[4669]: I1001 11:51:32.881440 4669 scope.go:117] "RemoveContainer" containerID="369c0c5fd26b948f45b9ba644bad9af4138341597e162def31a60aa135c2634d" Oct 01 11:51:32 crc kubenswrapper[4669]: I1001 11:51:32.952149 4669 scope.go:117] "RemoveContainer" containerID="4a0c94e531789cbc7fdb5d291d83f148863fe78b4310a3f2bac4b6c6abb91062" Oct 01 11:52:01 crc kubenswrapper[4669]: I1001 11:52:01.863702 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:52:01 crc kubenswrapper[4669]: I1001 11:52:01.864483 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:52:31 crc kubenswrapper[4669]: I1001 11:52:31.863386 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:52:31 crc kubenswrapper[4669]: I1001 11:52:31.865958 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:52:33 crc kubenswrapper[4669]: I1001 11:52:33.107899 4669 scope.go:117] "RemoveContainer" containerID="cf9ebe8ddf8119487d1976169a5e06441f5e00cce2b3cb03d7880e9fda69925c" Oct 01 11:52:33 crc kubenswrapper[4669]: I1001 11:52:33.175631 4669 scope.go:117] "RemoveContainer" containerID="fe4bfdafde10c16962a1f295de5f5ead419963d2b6c9c7ec5abc8f8f14f08c61" Oct 01 11:53:01 crc kubenswrapper[4669]: I1001 11:53:01.864315 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:53:01 crc kubenswrapper[4669]: I1001 11:53:01.865748 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:53:01 crc kubenswrapper[4669]: I1001 
11:53:01.865884 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:53:01 crc kubenswrapper[4669]: I1001 11:53:01.867715 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b1236276e91901ca356b23317942bcba8b16d3a037aab0000d2acb95db6570b"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 11:53:01 crc kubenswrapper[4669]: I1001 11:53:01.868059 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://7b1236276e91901ca356b23317942bcba8b16d3a037aab0000d2acb95db6570b" gracePeriod=600 Oct 01 11:53:02 crc kubenswrapper[4669]: I1001 11:53:02.933798 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="7b1236276e91901ca356b23317942bcba8b16d3a037aab0000d2acb95db6570b" exitCode=0 Oct 01 11:53:02 crc kubenswrapper[4669]: I1001 11:53:02.933865 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"7b1236276e91901ca356b23317942bcba8b16d3a037aab0000d2acb95db6570b"} Oct 01 11:53:02 crc kubenswrapper[4669]: I1001 11:53:02.934676 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55"} Oct 01 11:53:02 crc kubenswrapper[4669]: I1001 11:53:02.934713 4669 scope.go:117] 
"RemoveContainer" containerID="7e52cf47b1ea2351c50bcd89b78dca4005cb050fc916aa94ef178ab99a189cf3" Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.355994 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xxfnp"] Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.359588 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.371916 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxfnp"] Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.411818 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tq9g\" (UniqueName: \"kubernetes.io/projected/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-kube-api-access-8tq9g\") pod \"certified-operators-xxfnp\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.411953 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-utilities\") pod \"certified-operators-xxfnp\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.411982 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-catalog-content\") pod \"certified-operators-xxfnp\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.514485 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8tq9g\" (UniqueName: \"kubernetes.io/projected/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-kube-api-access-8tq9g\") pod \"certified-operators-xxfnp\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.514597 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-utilities\") pod \"certified-operators-xxfnp\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.514619 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-catalog-content\") pod \"certified-operators-xxfnp\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.515097 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-catalog-content\") pod \"certified-operators-xxfnp\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.515869 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-utilities\") pod \"certified-operators-xxfnp\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.549568 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8tq9g\" (UniqueName: \"kubernetes.io/projected/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-kube-api-access-8tq9g\") pod \"certified-operators-xxfnp\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:25 crc kubenswrapper[4669]: I1001 11:53:25.684801 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:26 crc kubenswrapper[4669]: I1001 11:53:26.143359 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxfnp"] Oct 01 11:53:26 crc kubenswrapper[4669]: I1001 11:53:26.240327 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfnp" event={"ID":"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d","Type":"ContainerStarted","Data":"1da0b6bf621cdf0c89e6ee799a800edccbcd9788353f1e0f7ed0a4c78a0010c2"} Oct 01 11:53:27 crc kubenswrapper[4669]: I1001 11:53:27.256129 4669 generic.go:334] "Generic (PLEG): container finished" podID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" containerID="ebf716234abaf9c7045b63c9c03d9c216f35767044071422c4a27abd51d1001e" exitCode=0 Oct 01 11:53:27 crc kubenswrapper[4669]: I1001 11:53:27.256607 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfnp" event={"ID":"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d","Type":"ContainerDied","Data":"ebf716234abaf9c7045b63c9c03d9c216f35767044071422c4a27abd51d1001e"} Oct 01 11:53:28 crc kubenswrapper[4669]: I1001 11:53:28.269246 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfnp" event={"ID":"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d","Type":"ContainerStarted","Data":"05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd"} Oct 01 11:53:29 crc kubenswrapper[4669]: I1001 11:53:29.282563 4669 generic.go:334] "Generic (PLEG): container finished" 
podID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" containerID="05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd" exitCode=0 Oct 01 11:53:29 crc kubenswrapper[4669]: I1001 11:53:29.282632 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfnp" event={"ID":"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d","Type":"ContainerDied","Data":"05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd"} Oct 01 11:53:30 crc kubenswrapper[4669]: I1001 11:53:30.296936 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfnp" event={"ID":"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d","Type":"ContainerStarted","Data":"86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e"} Oct 01 11:53:30 crc kubenswrapper[4669]: I1001 11:53:30.335650 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xxfnp" podStartSLOduration=2.875703052 podStartE2EDuration="5.335627088s" podCreationTimestamp="2025-10-01 11:53:25 +0000 UTC" firstStartedPulling="2025-10-01 11:53:27.259507945 +0000 UTC m=+1498.359072922" lastFinishedPulling="2025-10-01 11:53:29.719431941 +0000 UTC m=+1500.818996958" observedRunningTime="2025-10-01 11:53:30.326542512 +0000 UTC m=+1501.426107489" watchObservedRunningTime="2025-10-01 11:53:30.335627088 +0000 UTC m=+1501.435192065" Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.724207 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4jwn9"] Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.729432 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.763658 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jwn9"] Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.802982 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-catalog-content\") pod \"redhat-marketplace-4jwn9\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.803133 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-utilities\") pod \"redhat-marketplace-4jwn9\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.803279 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrg4z\" (UniqueName: \"kubernetes.io/projected/75f7114a-2123-49f7-9433-70f2ebcccddc-kube-api-access-xrg4z\") pod \"redhat-marketplace-4jwn9\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.906025 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-catalog-content\") pod \"redhat-marketplace-4jwn9\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.906140 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-utilities\") pod \"redhat-marketplace-4jwn9\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.906245 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrg4z\" (UniqueName: \"kubernetes.io/projected/75f7114a-2123-49f7-9433-70f2ebcccddc-kube-api-access-xrg4z\") pod \"redhat-marketplace-4jwn9\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.906716 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-catalog-content\") pod \"redhat-marketplace-4jwn9\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.906919 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-utilities\") pod \"redhat-marketplace-4jwn9\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:32 crc kubenswrapper[4669]: I1001 11:53:32.938270 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrg4z\" (UniqueName: \"kubernetes.io/projected/75f7114a-2123-49f7-9433-70f2ebcccddc-kube-api-access-xrg4z\") pod \"redhat-marketplace-4jwn9\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:33 crc kubenswrapper[4669]: I1001 11:53:33.064378 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:33 crc kubenswrapper[4669]: I1001 11:53:33.249592 4669 scope.go:117] "RemoveContainer" containerID="3f2b1a344e4cd849a66e7f8f80ab83bf8881c32632fe6c57cbe158548e0aeaf4" Oct 01 11:53:33 crc kubenswrapper[4669]: I1001 11:53:33.369283 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jwn9"] Oct 01 11:53:34 crc kubenswrapper[4669]: I1001 11:53:34.357581 4669 generic.go:334] "Generic (PLEG): container finished" podID="75f7114a-2123-49f7-9433-70f2ebcccddc" containerID="f78927fd4de0010248d5a3fae81041e37e7c5a3d6a37ca2e7da8cd3103c9efb9" exitCode=0 Oct 01 11:53:34 crc kubenswrapper[4669]: I1001 11:53:34.357735 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jwn9" event={"ID":"75f7114a-2123-49f7-9433-70f2ebcccddc","Type":"ContainerDied","Data":"f78927fd4de0010248d5a3fae81041e37e7c5a3d6a37ca2e7da8cd3103c9efb9"} Oct 01 11:53:34 crc kubenswrapper[4669]: I1001 11:53:34.357956 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jwn9" event={"ID":"75f7114a-2123-49f7-9433-70f2ebcccddc","Type":"ContainerStarted","Data":"87e978555418f12eda58f8693a17b232a22e22b04ecd741ddc526d0834165920"} Oct 01 11:53:35 crc kubenswrapper[4669]: I1001 11:53:35.685597 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:35 crc kubenswrapper[4669]: I1001 11:53:35.686952 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:35 crc kubenswrapper[4669]: I1001 11:53:35.752240 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:36 crc kubenswrapper[4669]: I1001 11:53:36.387311 4669 generic.go:334] "Generic (PLEG): 
container finished" podID="75f7114a-2123-49f7-9433-70f2ebcccddc" containerID="7b5b970ffc6048aaa1042e9edf86fdb3ad5c2776382cbef26c1601ae06bce1d0" exitCode=0 Oct 01 11:53:36 crc kubenswrapper[4669]: I1001 11:53:36.387406 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jwn9" event={"ID":"75f7114a-2123-49f7-9433-70f2ebcccddc","Type":"ContainerDied","Data":"7b5b970ffc6048aaa1042e9edf86fdb3ad5c2776382cbef26c1601ae06bce1d0"} Oct 01 11:53:36 crc kubenswrapper[4669]: I1001 11:53:36.471803 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:37 crc kubenswrapper[4669]: I1001 11:53:37.312714 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xxfnp"] Oct 01 11:53:37 crc kubenswrapper[4669]: I1001 11:53:37.402691 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jwn9" event={"ID":"75f7114a-2123-49f7-9433-70f2ebcccddc","Type":"ContainerStarted","Data":"07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3"} Oct 01 11:53:37 crc kubenswrapper[4669]: I1001 11:53:37.433804 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4jwn9" podStartSLOduration=2.934155267 podStartE2EDuration="5.433776943s" podCreationTimestamp="2025-10-01 11:53:32 +0000 UTC" firstStartedPulling="2025-10-01 11:53:34.362422979 +0000 UTC m=+1505.461987956" lastFinishedPulling="2025-10-01 11:53:36.862044635 +0000 UTC m=+1507.961609632" observedRunningTime="2025-10-01 11:53:37.423415714 +0000 UTC m=+1508.522980721" watchObservedRunningTime="2025-10-01 11:53:37.433776943 +0000 UTC m=+1508.533341930" Oct 01 11:53:38 crc kubenswrapper[4669]: I1001 11:53:38.415573 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xxfnp" 
podUID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" containerName="registry-server" containerID="cri-o://86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e" gracePeriod=2 Oct 01 11:53:38 crc kubenswrapper[4669]: I1001 11:53:38.985643 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.077019 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tq9g\" (UniqueName: \"kubernetes.io/projected/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-kube-api-access-8tq9g\") pod \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.078427 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-utilities\") pod \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.078650 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-catalog-content\") pod \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\" (UID: \"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d\") " Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.080177 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-utilities" (OuterVolumeSpecName: "utilities") pod "ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" (UID: "ab67f15d-08b6-4c56-b44b-ba24a3cbc31d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.089782 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-kube-api-access-8tq9g" (OuterVolumeSpecName: "kube-api-access-8tq9g") pod "ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" (UID: "ab67f15d-08b6-4c56-b44b-ba24a3cbc31d"). InnerVolumeSpecName "kube-api-access-8tq9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.152334 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" (UID: "ab67f15d-08b6-4c56-b44b-ba24a3cbc31d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.185013 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tq9g\" (UniqueName: \"kubernetes.io/projected/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-kube-api-access-8tq9g\") on node \"crc\" DevicePath \"\"" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.185122 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.185149 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.432735 4669 generic.go:334] "Generic (PLEG): container finished" podID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" 
containerID="86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e" exitCode=0 Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.432799 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfnp" event={"ID":"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d","Type":"ContainerDied","Data":"86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e"} Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.432846 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfnp" event={"ID":"ab67f15d-08b6-4c56-b44b-ba24a3cbc31d","Type":"ContainerDied","Data":"1da0b6bf621cdf0c89e6ee799a800edccbcd9788353f1e0f7ed0a4c78a0010c2"} Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.432861 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxfnp" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.432889 4669 scope.go:117] "RemoveContainer" containerID="86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.467229 4669 scope.go:117] "RemoveContainer" containerID="05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.482607 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xxfnp"] Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.492016 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xxfnp"] Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.518852 4669 scope.go:117] "RemoveContainer" containerID="ebf716234abaf9c7045b63c9c03d9c216f35767044071422c4a27abd51d1001e" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.548697 4669 scope.go:117] "RemoveContainer" containerID="86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e" Oct 01 
11:53:39 crc kubenswrapper[4669]: E1001 11:53:39.549308 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e\": container with ID starting with 86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e not found: ID does not exist" containerID="86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.549349 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e"} err="failed to get container status \"86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e\": rpc error: code = NotFound desc = could not find container \"86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e\": container with ID starting with 86d60bf010ef6dbcb568db80ce5f714adca3b37c0c6123de8fe6b96a4643bc3e not found: ID does not exist" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.549375 4669 scope.go:117] "RemoveContainer" containerID="05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd" Oct 01 11:53:39 crc kubenswrapper[4669]: E1001 11:53:39.549686 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd\": container with ID starting with 05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd not found: ID does not exist" containerID="05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.549711 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd"} err="failed to get container status 
\"05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd\": rpc error: code = NotFound desc = could not find container \"05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd\": container with ID starting with 05a23def1505f1dd489e1d5e6800e84154e065a7f0d8e0e7227eeb04bdf01dfd not found: ID does not exist" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.549727 4669 scope.go:117] "RemoveContainer" containerID="ebf716234abaf9c7045b63c9c03d9c216f35767044071422c4a27abd51d1001e" Oct 01 11:53:39 crc kubenswrapper[4669]: E1001 11:53:39.550091 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf716234abaf9c7045b63c9c03d9c216f35767044071422c4a27abd51d1001e\": container with ID starting with ebf716234abaf9c7045b63c9c03d9c216f35767044071422c4a27abd51d1001e not found: ID does not exist" containerID="ebf716234abaf9c7045b63c9c03d9c216f35767044071422c4a27abd51d1001e" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.550153 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf716234abaf9c7045b63c9c03d9c216f35767044071422c4a27abd51d1001e"} err="failed to get container status \"ebf716234abaf9c7045b63c9c03d9c216f35767044071422c4a27abd51d1001e\": rpc error: code = NotFound desc = could not find container \"ebf716234abaf9c7045b63c9c03d9c216f35767044071422c4a27abd51d1001e\": container with ID starting with ebf716234abaf9c7045b63c9c03d9c216f35767044071422c4a27abd51d1001e not found: ID does not exist" Oct 01 11:53:39 crc kubenswrapper[4669]: I1001 11:53:39.678253 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" path="/var/lib/kubelet/pods/ab67f15d-08b6-4c56-b44b-ba24a3cbc31d/volumes" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.065321 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.066521 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.127206 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qbssd"] Oct 01 11:53:43 crc kubenswrapper[4669]: E1001 11:53:43.128189 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" containerName="extract-content" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.128218 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" containerName="extract-content" Oct 01 11:53:43 crc kubenswrapper[4669]: E1001 11:53:43.128248 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" containerName="registry-server" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.128255 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" containerName="registry-server" Oct 01 11:53:43 crc kubenswrapper[4669]: E1001 11:53:43.128318 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" containerName="extract-utilities" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.128329 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" containerName="extract-utilities" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.128566 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab67f15d-08b6-4c56-b44b-ba24a3cbc31d" containerName="registry-server" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.130895 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.143777 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.159166 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qbssd"] Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.285704 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-catalog-content\") pod \"community-operators-qbssd\" (UID: \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.286045 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m76gf\" (UniqueName: \"kubernetes.io/projected/48f05668-ed5f-4ea7-85d8-9b8ad6777948-kube-api-access-m76gf\") pod \"community-operators-qbssd\" (UID: \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.286431 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-utilities\") pod \"community-operators-qbssd\" (UID: \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.389105 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-catalog-content\") pod \"community-operators-qbssd\" (UID: 
\"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.389208 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m76gf\" (UniqueName: \"kubernetes.io/projected/48f05668-ed5f-4ea7-85d8-9b8ad6777948-kube-api-access-m76gf\") pod \"community-operators-qbssd\" (UID: \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.389289 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-utilities\") pod \"community-operators-qbssd\" (UID: \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.389965 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-catalog-content\") pod \"community-operators-qbssd\" (UID: \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.389977 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-utilities\") pod \"community-operators-qbssd\" (UID: \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.415316 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m76gf\" (UniqueName: \"kubernetes.io/projected/48f05668-ed5f-4ea7-85d8-9b8ad6777948-kube-api-access-m76gf\") pod \"community-operators-qbssd\" (UID: 
\"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.497803 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:43 crc kubenswrapper[4669]: I1001 11:53:43.583021 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:44 crc kubenswrapper[4669]: I1001 11:53:44.082377 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qbssd"] Oct 01 11:53:44 crc kubenswrapper[4669]: I1001 11:53:44.501590 4669 generic.go:334] "Generic (PLEG): container finished" podID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" containerID="01b721d8d539721d82aa0bd6b17be4377c69f05b958b72193e21244186b513bd" exitCode=0 Oct 01 11:53:44 crc kubenswrapper[4669]: I1001 11:53:44.501672 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbssd" event={"ID":"48f05668-ed5f-4ea7-85d8-9b8ad6777948","Type":"ContainerDied","Data":"01b721d8d539721d82aa0bd6b17be4377c69f05b958b72193e21244186b513bd"} Oct 01 11:53:44 crc kubenswrapper[4669]: I1001 11:53:44.502132 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbssd" event={"ID":"48f05668-ed5f-4ea7-85d8-9b8ad6777948","Type":"ContainerStarted","Data":"8afa2a69bd993866210b7a3eadb56a092a68588457adda2e1c91c9f4d30eca67"} Oct 01 11:53:45 crc kubenswrapper[4669]: I1001 11:53:45.517904 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jwn9"] Oct 01 11:53:45 crc kubenswrapper[4669]: I1001 11:53:45.518710 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4jwn9" podUID="75f7114a-2123-49f7-9433-70f2ebcccddc" containerName="registry-server" 
containerID="cri-o://07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3" gracePeriod=2 Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.079002 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.156636 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-utilities\") pod \"75f7114a-2123-49f7-9433-70f2ebcccddc\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.156744 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrg4z\" (UniqueName: \"kubernetes.io/projected/75f7114a-2123-49f7-9433-70f2ebcccddc-kube-api-access-xrg4z\") pod \"75f7114a-2123-49f7-9433-70f2ebcccddc\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.156881 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-catalog-content\") pod \"75f7114a-2123-49f7-9433-70f2ebcccddc\" (UID: \"75f7114a-2123-49f7-9433-70f2ebcccddc\") " Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.158566 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-utilities" (OuterVolumeSpecName: "utilities") pod "75f7114a-2123-49f7-9433-70f2ebcccddc" (UID: "75f7114a-2123-49f7-9433-70f2ebcccddc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.167431 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f7114a-2123-49f7-9433-70f2ebcccddc-kube-api-access-xrg4z" (OuterVolumeSpecName: "kube-api-access-xrg4z") pod "75f7114a-2123-49f7-9433-70f2ebcccddc" (UID: "75f7114a-2123-49f7-9433-70f2ebcccddc"). InnerVolumeSpecName "kube-api-access-xrg4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.180537 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75f7114a-2123-49f7-9433-70f2ebcccddc" (UID: "75f7114a-2123-49f7-9433-70f2ebcccddc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.259708 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.259751 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrg4z\" (UniqueName: \"kubernetes.io/projected/75f7114a-2123-49f7-9433-70f2ebcccddc-kube-api-access-xrg4z\") on node \"crc\" DevicePath \"\"" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.259766 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f7114a-2123-49f7-9433-70f2ebcccddc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.525104 4669 generic.go:334] "Generic (PLEG): container finished" podID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" 
containerID="e271d0df5ec2a80af27a8ecea54d84bffe108633b9b138badc58c2b988af0f10" exitCode=0 Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.525198 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbssd" event={"ID":"48f05668-ed5f-4ea7-85d8-9b8ad6777948","Type":"ContainerDied","Data":"e271d0df5ec2a80af27a8ecea54d84bffe108633b9b138badc58c2b988af0f10"} Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.530041 4669 generic.go:334] "Generic (PLEG): container finished" podID="75f7114a-2123-49f7-9433-70f2ebcccddc" containerID="07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3" exitCode=0 Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.530090 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jwn9" event={"ID":"75f7114a-2123-49f7-9433-70f2ebcccddc","Type":"ContainerDied","Data":"07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3"} Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.530113 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jwn9" event={"ID":"75f7114a-2123-49f7-9433-70f2ebcccddc","Type":"ContainerDied","Data":"87e978555418f12eda58f8693a17b232a22e22b04ecd741ddc526d0834165920"} Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.530135 4669 scope.go:117] "RemoveContainer" containerID="07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.530168 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jwn9" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.577346 4669 scope.go:117] "RemoveContainer" containerID="7b5b970ffc6048aaa1042e9edf86fdb3ad5c2776382cbef26c1601ae06bce1d0" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.580809 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jwn9"] Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.590205 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jwn9"] Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.601884 4669 scope.go:117] "RemoveContainer" containerID="f78927fd4de0010248d5a3fae81041e37e7c5a3d6a37ca2e7da8cd3103c9efb9" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.680113 4669 scope.go:117] "RemoveContainer" containerID="07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3" Oct 01 11:53:46 crc kubenswrapper[4669]: E1001 11:53:46.680762 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3\": container with ID starting with 07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3 not found: ID does not exist" containerID="07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.680814 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3"} err="failed to get container status \"07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3\": rpc error: code = NotFound desc = could not find container \"07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3\": container with ID starting with 07f0fe9dcc85824a9e4194ed0e61ca0851649bfc5ceb02aff812c572cfd9baf3 not found: 
ID does not exist" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.680855 4669 scope.go:117] "RemoveContainer" containerID="7b5b970ffc6048aaa1042e9edf86fdb3ad5c2776382cbef26c1601ae06bce1d0" Oct 01 11:53:46 crc kubenswrapper[4669]: E1001 11:53:46.684453 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b5b970ffc6048aaa1042e9edf86fdb3ad5c2776382cbef26c1601ae06bce1d0\": container with ID starting with 7b5b970ffc6048aaa1042e9edf86fdb3ad5c2776382cbef26c1601ae06bce1d0 not found: ID does not exist" containerID="7b5b970ffc6048aaa1042e9edf86fdb3ad5c2776382cbef26c1601ae06bce1d0" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.684552 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5b970ffc6048aaa1042e9edf86fdb3ad5c2776382cbef26c1601ae06bce1d0"} err="failed to get container status \"7b5b970ffc6048aaa1042e9edf86fdb3ad5c2776382cbef26c1601ae06bce1d0\": rpc error: code = NotFound desc = could not find container \"7b5b970ffc6048aaa1042e9edf86fdb3ad5c2776382cbef26c1601ae06bce1d0\": container with ID starting with 7b5b970ffc6048aaa1042e9edf86fdb3ad5c2776382cbef26c1601ae06bce1d0 not found: ID does not exist" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.684610 4669 scope.go:117] "RemoveContainer" containerID="f78927fd4de0010248d5a3fae81041e37e7c5a3d6a37ca2e7da8cd3103c9efb9" Oct 01 11:53:46 crc kubenswrapper[4669]: E1001 11:53:46.685267 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78927fd4de0010248d5a3fae81041e37e7c5a3d6a37ca2e7da8cd3103c9efb9\": container with ID starting with f78927fd4de0010248d5a3fae81041e37e7c5a3d6a37ca2e7da8cd3103c9efb9 not found: ID does not exist" containerID="f78927fd4de0010248d5a3fae81041e37e7c5a3d6a37ca2e7da8cd3103c9efb9" Oct 01 11:53:46 crc kubenswrapper[4669]: I1001 11:53:46.685338 4669 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f78927fd4de0010248d5a3fae81041e37e7c5a3d6a37ca2e7da8cd3103c9efb9"} err="failed to get container status \"f78927fd4de0010248d5a3fae81041e37e7c5a3d6a37ca2e7da8cd3103c9efb9\": rpc error: code = NotFound desc = could not find container \"f78927fd4de0010248d5a3fae81041e37e7c5a3d6a37ca2e7da8cd3103c9efb9\": container with ID starting with f78927fd4de0010248d5a3fae81041e37e7c5a3d6a37ca2e7da8cd3103c9efb9 not found: ID does not exist" Oct 01 11:53:47 crc kubenswrapper[4669]: I1001 11:53:47.546558 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbssd" event={"ID":"48f05668-ed5f-4ea7-85d8-9b8ad6777948","Type":"ContainerStarted","Data":"a2f60114ec41fdb01b3357e0512a02a1c2c9e385743036f267c90540359914d3"} Oct 01 11:53:47 crc kubenswrapper[4669]: I1001 11:53:47.572865 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qbssd" podStartSLOduration=2.091996656 podStartE2EDuration="4.572839134s" podCreationTimestamp="2025-10-01 11:53:43 +0000 UTC" firstStartedPulling="2025-10-01 11:53:44.504054003 +0000 UTC m=+1515.603618980" lastFinishedPulling="2025-10-01 11:53:46.984896481 +0000 UTC m=+1518.084461458" observedRunningTime="2025-10-01 11:53:47.563883961 +0000 UTC m=+1518.663448958" watchObservedRunningTime="2025-10-01 11:53:47.572839134 +0000 UTC m=+1518.672404111" Oct 01 11:53:47 crc kubenswrapper[4669]: I1001 11:53:47.657830 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f7114a-2123-49f7-9433-70f2ebcccddc" path="/var/lib/kubelet/pods/75f7114a-2123-49f7-9433-70f2ebcccddc/volumes" Oct 01 11:53:53 crc kubenswrapper[4669]: I1001 11:53:53.498325 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:53 crc kubenswrapper[4669]: I1001 11:53:53.498971 4669 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:53 crc kubenswrapper[4669]: I1001 11:53:53.581247 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:53 crc kubenswrapper[4669]: I1001 11:53:53.669102 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:53 crc kubenswrapper[4669]: I1001 11:53:53.843619 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qbssd"] Oct 01 11:53:55 crc kubenswrapper[4669]: I1001 11:53:55.652672 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qbssd" podUID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" containerName="registry-server" containerID="cri-o://a2f60114ec41fdb01b3357e0512a02a1c2c9e385743036f267c90540359914d3" gracePeriod=2 Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.682790 4669 generic.go:334] "Generic (PLEG): container finished" podID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" containerID="a2f60114ec41fdb01b3357e0512a02a1c2c9e385743036f267c90540359914d3" exitCode=0 Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.683360 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbssd" event={"ID":"48f05668-ed5f-4ea7-85d8-9b8ad6777948","Type":"ContainerDied","Data":"a2f60114ec41fdb01b3357e0512a02a1c2c9e385743036f267c90540359914d3"} Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.683834 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbssd" event={"ID":"48f05668-ed5f-4ea7-85d8-9b8ad6777948","Type":"ContainerDied","Data":"8afa2a69bd993866210b7a3eadb56a092a68588457adda2e1c91c9f4d30eca67"} Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.683870 4669 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8afa2a69bd993866210b7a3eadb56a092a68588457adda2e1c91c9f4d30eca67" Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.726877 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.820289 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m76gf\" (UniqueName: \"kubernetes.io/projected/48f05668-ed5f-4ea7-85d8-9b8ad6777948-kube-api-access-m76gf\") pod \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\" (UID: \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.820382 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-utilities\") pod \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\" (UID: \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.821041 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-catalog-content\") pod \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\" (UID: \"48f05668-ed5f-4ea7-85d8-9b8ad6777948\") " Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.822528 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-utilities" (OuterVolumeSpecName: "utilities") pod "48f05668-ed5f-4ea7-85d8-9b8ad6777948" (UID: "48f05668-ed5f-4ea7-85d8-9b8ad6777948"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.830767 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.841637 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f05668-ed5f-4ea7-85d8-9b8ad6777948-kube-api-access-m76gf" (OuterVolumeSpecName: "kube-api-access-m76gf") pod "48f05668-ed5f-4ea7-85d8-9b8ad6777948" (UID: "48f05668-ed5f-4ea7-85d8-9b8ad6777948"). InnerVolumeSpecName "kube-api-access-m76gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.898218 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48f05668-ed5f-4ea7-85d8-9b8ad6777948" (UID: "48f05668-ed5f-4ea7-85d8-9b8ad6777948"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.932604 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f05668-ed5f-4ea7-85d8-9b8ad6777948-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:53:56 crc kubenswrapper[4669]: I1001 11:53:56.932650 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m76gf\" (UniqueName: \"kubernetes.io/projected/48f05668-ed5f-4ea7-85d8-9b8ad6777948-kube-api-access-m76gf\") on node \"crc\" DevicePath \"\"" Oct 01 11:53:57 crc kubenswrapper[4669]: I1001 11:53:57.699830 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qbssd" Oct 01 11:53:57 crc kubenswrapper[4669]: I1001 11:53:57.744852 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qbssd"] Oct 01 11:53:57 crc kubenswrapper[4669]: I1001 11:53:57.755781 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qbssd"] Oct 01 11:53:59 crc kubenswrapper[4669]: I1001 11:53:59.662606 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" path="/var/lib/kubelet/pods/48f05668-ed5f-4ea7-85d8-9b8ad6777948/volumes" Oct 01 11:54:02 crc kubenswrapper[4669]: I1001 11:54:02.785560 4669 generic.go:334] "Generic (PLEG): container finished" podID="b905607b-b7ef-420f-8c4e-603d4c788186" containerID="259a46ca9656b4105ec7b7f4186ea890927e5e3b9edf12a9a8dd5a1718c2daa7" exitCode=0 Oct 01 11:54:02 crc kubenswrapper[4669]: I1001 11:54:02.785708 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" event={"ID":"b905607b-b7ef-420f-8c4e-603d4c788186","Type":"ContainerDied","Data":"259a46ca9656b4105ec7b7f4186ea890927e5e3b9edf12a9a8dd5a1718c2daa7"} Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.392174 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.431443 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgmn9\" (UniqueName: \"kubernetes.io/projected/b905607b-b7ef-420f-8c4e-603d4c788186-kube-api-access-rgmn9\") pod \"b905607b-b7ef-420f-8c4e-603d4c788186\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.431519 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-bootstrap-combined-ca-bundle\") pod \"b905607b-b7ef-420f-8c4e-603d4c788186\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.431614 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-ssh-key\") pod \"b905607b-b7ef-420f-8c4e-603d4c788186\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.431723 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-inventory\") pod \"b905607b-b7ef-420f-8c4e-603d4c788186\" (UID: \"b905607b-b7ef-420f-8c4e-603d4c788186\") " Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.440118 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b905607b-b7ef-420f-8c4e-603d4c788186" (UID: "b905607b-b7ef-420f-8c4e-603d4c788186"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.442484 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b905607b-b7ef-420f-8c4e-603d4c788186-kube-api-access-rgmn9" (OuterVolumeSpecName: "kube-api-access-rgmn9") pod "b905607b-b7ef-420f-8c4e-603d4c788186" (UID: "b905607b-b7ef-420f-8c4e-603d4c788186"). InnerVolumeSpecName "kube-api-access-rgmn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.476351 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b905607b-b7ef-420f-8c4e-603d4c788186" (UID: "b905607b-b7ef-420f-8c4e-603d4c788186"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.482965 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-inventory" (OuterVolumeSpecName: "inventory") pod "b905607b-b7ef-420f-8c4e-603d4c788186" (UID: "b905607b-b7ef-420f-8c4e-603d4c788186"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.534342 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.534377 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.534388 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgmn9\" (UniqueName: \"kubernetes.io/projected/b905607b-b7ef-420f-8c4e-603d4c788186-kube-api-access-rgmn9\") on node \"crc\" DevicePath \"\"" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.534403 4669 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b905607b-b7ef-420f-8c4e-603d4c788186-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.788414 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hgsq8"] Oct 01 11:54:04 crc kubenswrapper[4669]: E1001 11:54:04.788992 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b905607b-b7ef-420f-8c4e-603d4c788186" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.789020 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b905607b-b7ef-420f-8c4e-603d4c788186" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 11:54:04 crc kubenswrapper[4669]: E1001 11:54:04.789059 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f7114a-2123-49f7-9433-70f2ebcccddc" containerName="registry-server" Oct 01 11:54:04 crc 
kubenswrapper[4669]: I1001 11:54:04.789069 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f7114a-2123-49f7-9433-70f2ebcccddc" containerName="registry-server" Oct 01 11:54:04 crc kubenswrapper[4669]: E1001 11:54:04.789115 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" containerName="registry-server" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.789124 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" containerName="registry-server" Oct 01 11:54:04 crc kubenswrapper[4669]: E1001 11:54:04.789146 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" containerName="extract-utilities" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.789157 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" containerName="extract-utilities" Oct 01 11:54:04 crc kubenswrapper[4669]: E1001 11:54:04.789185 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" containerName="extract-content" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.789192 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" containerName="extract-content" Oct 01 11:54:04 crc kubenswrapper[4669]: E1001 11:54:04.789200 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f7114a-2123-49f7-9433-70f2ebcccddc" containerName="extract-content" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.789207 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f7114a-2123-49f7-9433-70f2ebcccddc" containerName="extract-content" Oct 01 11:54:04 crc kubenswrapper[4669]: E1001 11:54:04.789225 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f7114a-2123-49f7-9433-70f2ebcccddc" containerName="extract-utilities" Oct 01 11:54:04 crc 
kubenswrapper[4669]: I1001 11:54:04.789234 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f7114a-2123-49f7-9433-70f2ebcccddc" containerName="extract-utilities" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.789422 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="75f7114a-2123-49f7-9433-70f2ebcccddc" containerName="registry-server" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.789456 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f05668-ed5f-4ea7-85d8-9b8ad6777948" containerName="registry-server" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.789464 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b905607b-b7ef-420f-8c4e-603d4c788186" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.790838 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.819108 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hgsq8"] Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.860052 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-utilities\") pod \"redhat-operators-hgsq8\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.860154 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bz5c\" (UniqueName: \"kubernetes.io/projected/b474be11-cbe7-4c80-a666-603d792dde90-kube-api-access-4bz5c\") pod \"redhat-operators-hgsq8\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 
11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.860905 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-catalog-content\") pod \"redhat-operators-hgsq8\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.880722 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" event={"ID":"b905607b-b7ef-420f-8c4e-603d4c788186","Type":"ContainerDied","Data":"665d8419ecc8c0ca675890a0c2b05d2e640341c0fa9ec40b9985d7369105e018"} Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.880788 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="665d8419ecc8c0ca675890a0c2b05d2e640341c0fa9ec40b9985d7369105e018" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.880891 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.925096 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82"] Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.926758 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.932623 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.932666 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.933010 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.933395 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.945500 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82"] Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.964962 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-catalog-content\") pod \"redhat-operators-hgsq8\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.965059 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-utilities\") pod \"redhat-operators-hgsq8\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.965100 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bz5c\" (UniqueName: 
\"kubernetes.io/projected/b474be11-cbe7-4c80-a666-603d792dde90-kube-api-access-4bz5c\") pod \"redhat-operators-hgsq8\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.965144 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rvw82\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.965188 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87czz\" (UniqueName: \"kubernetes.io/projected/261f1c48-3c07-495d-b916-861c2a1943d8-kube-api-access-87czz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rvw82\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.965243 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rvw82\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.965808 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-catalog-content\") pod \"redhat-operators-hgsq8\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:04 crc 
kubenswrapper[4669]: I1001 11:54:04.965821 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-utilities\") pod \"redhat-operators-hgsq8\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:04 crc kubenswrapper[4669]: I1001 11:54:04.990240 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bz5c\" (UniqueName: \"kubernetes.io/projected/b474be11-cbe7-4c80-a666-603d792dde90-kube-api-access-4bz5c\") pod \"redhat-operators-hgsq8\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:05 crc kubenswrapper[4669]: I1001 11:54:05.067567 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rvw82\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:54:05 crc kubenswrapper[4669]: I1001 11:54:05.067736 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rvw82\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:54:05 crc kubenswrapper[4669]: I1001 11:54:05.067781 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87czz\" (UniqueName: \"kubernetes.io/projected/261f1c48-3c07-495d-b916-861c2a1943d8-kube-api-access-87czz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rvw82\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:54:05 crc kubenswrapper[4669]: I1001 11:54:05.073799 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rvw82\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:54:05 crc kubenswrapper[4669]: I1001 11:54:05.075922 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rvw82\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:54:05 crc kubenswrapper[4669]: I1001 11:54:05.090227 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87czz\" (UniqueName: \"kubernetes.io/projected/261f1c48-3c07-495d-b916-861c2a1943d8-kube-api-access-87czz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rvw82\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:54:05 crc kubenswrapper[4669]: I1001 11:54:05.150152 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:05 crc kubenswrapper[4669]: I1001 11:54:05.262706 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:54:05 crc kubenswrapper[4669]: I1001 11:54:05.692985 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hgsq8"] Oct 01 11:54:05 crc kubenswrapper[4669]: I1001 11:54:05.895990 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgsq8" event={"ID":"b474be11-cbe7-4c80-a666-603d792dde90","Type":"ContainerStarted","Data":"9f30f2b78e5d8e9cc01be9ecaeed2640aa907657c838cd195fc7b422971ffaab"} Oct 01 11:54:05 crc kubenswrapper[4669]: I1001 11:54:05.922659 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82"] Oct 01 11:54:06 crc kubenswrapper[4669]: I1001 11:54:06.911574 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" event={"ID":"261f1c48-3c07-495d-b916-861c2a1943d8","Type":"ContainerStarted","Data":"925514b17efcda5c28965e0f6423f45b6872ed4dd95bd1981ec21f4888c90264"} Oct 01 11:54:06 crc kubenswrapper[4669]: I1001 11:54:06.912470 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" event={"ID":"261f1c48-3c07-495d-b916-861c2a1943d8","Type":"ContainerStarted","Data":"560c9dc2237377b3750e5941fa48dba879a5b47f15012cdc9985e10fd38d06c9"} Oct 01 11:54:06 crc kubenswrapper[4669]: I1001 11:54:06.916143 4669 generic.go:334] "Generic (PLEG): container finished" podID="b474be11-cbe7-4c80-a666-603d792dde90" containerID="5544291209fae1af20cb164e62a2f081acd242e73c7e23dc7d8988ae00d564ed" exitCode=0 Oct 01 11:54:06 crc kubenswrapper[4669]: I1001 11:54:06.916216 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgsq8" 
event={"ID":"b474be11-cbe7-4c80-a666-603d792dde90","Type":"ContainerDied","Data":"5544291209fae1af20cb164e62a2f081acd242e73c7e23dc7d8988ae00d564ed"} Oct 01 11:54:06 crc kubenswrapper[4669]: I1001 11:54:06.951370 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" podStartSLOduration=2.329216356 podStartE2EDuration="2.95134348s" podCreationTimestamp="2025-10-01 11:54:04 +0000 UTC" firstStartedPulling="2025-10-01 11:54:05.934749738 +0000 UTC m=+1537.034314715" lastFinishedPulling="2025-10-01 11:54:06.556876862 +0000 UTC m=+1537.656441839" observedRunningTime="2025-10-01 11:54:06.933044673 +0000 UTC m=+1538.032609660" watchObservedRunningTime="2025-10-01 11:54:06.95134348 +0000 UTC m=+1538.050908457" Oct 01 11:54:08 crc kubenswrapper[4669]: I1001 11:54:08.942924 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgsq8" event={"ID":"b474be11-cbe7-4c80-a666-603d792dde90","Type":"ContainerStarted","Data":"ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67"} Oct 01 11:54:09 crc kubenswrapper[4669]: I1001 11:54:09.963576 4669 generic.go:334] "Generic (PLEG): container finished" podID="b474be11-cbe7-4c80-a666-603d792dde90" containerID="ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67" exitCode=0 Oct 01 11:54:09 crc kubenswrapper[4669]: I1001 11:54:09.963732 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgsq8" event={"ID":"b474be11-cbe7-4c80-a666-603d792dde90","Type":"ContainerDied","Data":"ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67"} Oct 01 11:54:11 crc kubenswrapper[4669]: I1001 11:54:11.989375 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgsq8" 
event={"ID":"b474be11-cbe7-4c80-a666-603d792dde90","Type":"ContainerStarted","Data":"a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3"} Oct 01 11:54:12 crc kubenswrapper[4669]: I1001 11:54:12.026340 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hgsq8" podStartSLOduration=4.200031069 podStartE2EDuration="8.02630502s" podCreationTimestamp="2025-10-01 11:54:04 +0000 UTC" firstStartedPulling="2025-10-01 11:54:06.918243384 +0000 UTC m=+1538.017808381" lastFinishedPulling="2025-10-01 11:54:10.744517345 +0000 UTC m=+1541.844082332" observedRunningTime="2025-10-01 11:54:12.008521077 +0000 UTC m=+1543.108086094" watchObservedRunningTime="2025-10-01 11:54:12.02630502 +0000 UTC m=+1543.125870037" Oct 01 11:54:15 crc kubenswrapper[4669]: I1001 11:54:15.151008 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:15 crc kubenswrapper[4669]: I1001 11:54:15.151975 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:16 crc kubenswrapper[4669]: I1001 11:54:16.226579 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hgsq8" podUID="b474be11-cbe7-4c80-a666-603d792dde90" containerName="registry-server" probeResult="failure" output=< Oct 01 11:54:16 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 01 11:54:16 crc kubenswrapper[4669]: > Oct 01 11:54:25 crc kubenswrapper[4669]: I1001 11:54:25.231617 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:25 crc kubenswrapper[4669]: I1001 11:54:25.321523 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:25 crc kubenswrapper[4669]: I1001 
11:54:25.481659 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hgsq8"] Oct 01 11:54:27 crc kubenswrapper[4669]: I1001 11:54:27.182247 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hgsq8" podUID="b474be11-cbe7-4c80-a666-603d792dde90" containerName="registry-server" containerID="cri-o://a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3" gracePeriod=2 Oct 01 11:54:27 crc kubenswrapper[4669]: I1001 11:54:27.773298 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:27 crc kubenswrapper[4669]: I1001 11:54:27.908374 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bz5c\" (UniqueName: \"kubernetes.io/projected/b474be11-cbe7-4c80-a666-603d792dde90-kube-api-access-4bz5c\") pod \"b474be11-cbe7-4c80-a666-603d792dde90\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " Oct 01 11:54:27 crc kubenswrapper[4669]: I1001 11:54:27.908806 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-catalog-content\") pod \"b474be11-cbe7-4c80-a666-603d792dde90\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " Oct 01 11:54:27 crc kubenswrapper[4669]: I1001 11:54:27.908884 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-utilities\") pod \"b474be11-cbe7-4c80-a666-603d792dde90\" (UID: \"b474be11-cbe7-4c80-a666-603d792dde90\") " Oct 01 11:54:27 crc kubenswrapper[4669]: I1001 11:54:27.909778 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-utilities" (OuterVolumeSpecName: 
"utilities") pod "b474be11-cbe7-4c80-a666-603d792dde90" (UID: "b474be11-cbe7-4c80-a666-603d792dde90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:54:27 crc kubenswrapper[4669]: I1001 11:54:27.915445 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b474be11-cbe7-4c80-a666-603d792dde90-kube-api-access-4bz5c" (OuterVolumeSpecName: "kube-api-access-4bz5c") pod "b474be11-cbe7-4c80-a666-603d792dde90" (UID: "b474be11-cbe7-4c80-a666-603d792dde90"). InnerVolumeSpecName "kube-api-access-4bz5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.011290 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b474be11-cbe7-4c80-a666-603d792dde90" (UID: "b474be11-cbe7-4c80-a666-603d792dde90"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.011890 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.011921 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b474be11-cbe7-4c80-a666-603d792dde90-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.011932 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bz5c\" (UniqueName: \"kubernetes.io/projected/b474be11-cbe7-4c80-a666-603d792dde90-kube-api-access-4bz5c\") on node \"crc\" DevicePath \"\"" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.200899 4669 generic.go:334] "Generic (PLEG): container finished" podID="b474be11-cbe7-4c80-a666-603d792dde90" containerID="a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3" exitCode=0 Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.200971 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgsq8" event={"ID":"b474be11-cbe7-4c80-a666-603d792dde90","Type":"ContainerDied","Data":"a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3"} Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.201020 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgsq8" event={"ID":"b474be11-cbe7-4c80-a666-603d792dde90","Type":"ContainerDied","Data":"9f30f2b78e5d8e9cc01be9ecaeed2640aa907657c838cd195fc7b422971ffaab"} Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.201058 4669 scope.go:117] "RemoveContainer" containerID="a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.201342 
4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgsq8" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.262251 4669 scope.go:117] "RemoveContainer" containerID="ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.300490 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hgsq8"] Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.303866 4669 scope.go:117] "RemoveContainer" containerID="5544291209fae1af20cb164e62a2f081acd242e73c7e23dc7d8988ae00d564ed" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.318290 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hgsq8"] Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.366715 4669 scope.go:117] "RemoveContainer" containerID="a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3" Oct 01 11:54:28 crc kubenswrapper[4669]: E1001 11:54:28.367554 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3\": container with ID starting with a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3 not found: ID does not exist" containerID="a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.367690 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3"} err="failed to get container status \"a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3\": rpc error: code = NotFound desc = could not find container \"a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3\": container with ID starting with 
a3984fe30404a379a41b54ef683e59a3656db4a347d20c0973c40bee15d4dec3 not found: ID does not exist" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.367762 4669 scope.go:117] "RemoveContainer" containerID="ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67" Oct 01 11:54:28 crc kubenswrapper[4669]: E1001 11:54:28.368444 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67\": container with ID starting with ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67 not found: ID does not exist" containerID="ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.368511 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67"} err="failed to get container status \"ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67\": rpc error: code = NotFound desc = could not find container \"ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67\": container with ID starting with ba047693b60f0e1e687af1ebb7f8a8bd7d370c951c931a9a3a76724d2c2c7f67 not found: ID does not exist" Oct 01 11:54:28 crc kubenswrapper[4669]: I1001 11:54:28.368550 4669 scope.go:117] "RemoveContainer" containerID="5544291209fae1af20cb164e62a2f081acd242e73c7e23dc7d8988ae00d564ed" Oct 01 11:54:28 crc kubenswrapper[4669]: E1001 11:54:28.369098 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5544291209fae1af20cb164e62a2f081acd242e73c7e23dc7d8988ae00d564ed\": container with ID starting with 5544291209fae1af20cb164e62a2f081acd242e73c7e23dc7d8988ae00d564ed not found: ID does not exist" containerID="5544291209fae1af20cb164e62a2f081acd242e73c7e23dc7d8988ae00d564ed" Oct 01 11:54:28 crc 
kubenswrapper[4669]: I1001 11:54:28.369169 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5544291209fae1af20cb164e62a2f081acd242e73c7e23dc7d8988ae00d564ed"} err="failed to get container status \"5544291209fae1af20cb164e62a2f081acd242e73c7e23dc7d8988ae00d564ed\": rpc error: code = NotFound desc = could not find container \"5544291209fae1af20cb164e62a2f081acd242e73c7e23dc7d8988ae00d564ed\": container with ID starting with 5544291209fae1af20cb164e62a2f081acd242e73c7e23dc7d8988ae00d564ed not found: ID does not exist" Oct 01 11:54:29 crc kubenswrapper[4669]: I1001 11:54:29.666737 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b474be11-cbe7-4c80-a666-603d792dde90" path="/var/lib/kubelet/pods/b474be11-cbe7-4c80-a666-603d792dde90/volumes" Oct 01 11:54:52 crc kubenswrapper[4669]: I1001 11:54:52.074156 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-5s8rb"] Oct 01 11:54:52 crc kubenswrapper[4669]: I1001 11:54:52.091819 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-5s8rb"] Oct 01 11:54:53 crc kubenswrapper[4669]: I1001 11:54:53.675693 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02cd23a9-752d-439c-9630-94967cef4a4f" path="/var/lib/kubelet/pods/02cd23a9-752d-439c-9630-94967cef4a4f/volumes" Oct 01 11:54:56 crc kubenswrapper[4669]: I1001 11:54:56.092607 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-spqkd"] Oct 01 11:54:56 crc kubenswrapper[4669]: I1001 11:54:56.107639 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-f24vb"] Oct 01 11:54:56 crc kubenswrapper[4669]: I1001 11:54:56.124666 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-spqkd"] Oct 01 11:54:56 crc kubenswrapper[4669]: I1001 11:54:56.145763 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-db-create-f24vb"] Oct 01 11:54:57 crc kubenswrapper[4669]: I1001 11:54:57.657120 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3362993f-ccb2-4f32-935f-8fd98745982e" path="/var/lib/kubelet/pods/3362993f-ccb2-4f32-935f-8fd98745982e/volumes" Oct 01 11:54:57 crc kubenswrapper[4669]: I1001 11:54:57.657828 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5511e00f-0d66-4990-a6b2-7223178d2806" path="/var/lib/kubelet/pods/5511e00f-0d66-4990-a6b2-7223178d2806/volumes" Oct 01 11:55:02 crc kubenswrapper[4669]: I1001 11:55:02.061280 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5df9-account-create-j88kr"] Oct 01 11:55:02 crc kubenswrapper[4669]: I1001 11:55:02.081902 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5df9-account-create-j88kr"] Oct 01 11:55:03 crc kubenswrapper[4669]: I1001 11:55:03.667736 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ce18eda-e988-4897-bd2f-8e656c93b271" path="/var/lib/kubelet/pods/6ce18eda-e988-4897-bd2f-8e656c93b271/volumes" Oct 01 11:55:06 crc kubenswrapper[4669]: I1001 11:55:06.042796 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5593-account-create-gvtw4"] Oct 01 11:55:06 crc kubenswrapper[4669]: I1001 11:55:06.055138 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5593-account-create-gvtw4"] Oct 01 11:55:06 crc kubenswrapper[4669]: I1001 11:55:06.066528 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-392f-account-create-vvqc8"] Oct 01 11:55:06 crc kubenswrapper[4669]: I1001 11:55:06.077894 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-392f-account-create-vvqc8"] Oct 01 11:55:07 crc kubenswrapper[4669]: I1001 11:55:07.661402 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c280659a-1e6a-4f60-b793-0147fe1b4ecf" 
path="/var/lib/kubelet/pods/c280659a-1e6a-4f60-b793-0147fe1b4ecf/volumes" Oct 01 11:55:07 crc kubenswrapper[4669]: I1001 11:55:07.663484 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf38126-0bdb-4a19-86d7-469597350f4f" path="/var/lib/kubelet/pods/ddf38126-0bdb-4a19-86d7-469597350f4f/volumes" Oct 01 11:55:13 crc kubenswrapper[4669]: I1001 11:55:13.041893 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-5srjm"] Oct 01 11:55:13 crc kubenswrapper[4669]: I1001 11:55:13.050934 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-5srjm"] Oct 01 11:55:13 crc kubenswrapper[4669]: I1001 11:55:13.657884 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48969021-b816-4ae7-a52c-f26845df0580" path="/var/lib/kubelet/pods/48969021-b816-4ae7-a52c-f26845df0580/volumes" Oct 01 11:55:16 crc kubenswrapper[4669]: I1001 11:55:16.041941 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-jpbz8"] Oct 01 11:55:16 crc kubenswrapper[4669]: I1001 11:55:16.055657 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-4stcf"] Oct 01 11:55:16 crc kubenswrapper[4669]: I1001 11:55:16.072383 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-4stcf"] Oct 01 11:55:16 crc kubenswrapper[4669]: I1001 11:55:16.087983 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-jpbz8"] Oct 01 11:55:17 crc kubenswrapper[4669]: I1001 11:55:17.663007 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60db4b61-e005-45c2-a41c-9ba9e7709a90" path="/var/lib/kubelet/pods/60db4b61-e005-45c2-a41c-9ba9e7709a90/volumes" Oct 01 11:55:17 crc kubenswrapper[4669]: I1001 11:55:17.664715 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7377422-847a-48cf-9248-7126e2fda461" 
path="/var/lib/kubelet/pods/b7377422-847a-48cf-9248-7126e2fda461/volumes" Oct 01 11:55:31 crc kubenswrapper[4669]: I1001 11:55:31.048622 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8dae-account-create-vx5r9"] Oct 01 11:55:31 crc kubenswrapper[4669]: I1001 11:55:31.059155 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8dae-account-create-vx5r9"] Oct 01 11:55:31 crc kubenswrapper[4669]: I1001 11:55:31.668739 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4e7624-e1b8-47b7-a7de-5566e2180147" path="/var/lib/kubelet/pods/9b4e7624-e1b8-47b7-a7de-5566e2180147/volumes" Oct 01 11:55:31 crc kubenswrapper[4669]: I1001 11:55:31.864884 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:55:31 crc kubenswrapper[4669]: I1001 11:55:31.864978 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:55:33 crc kubenswrapper[4669]: I1001 11:55:33.480322 4669 scope.go:117] "RemoveContainer" containerID="c049882bbc6dbbffe2abad1a14c6f12258de9c60de2cba855e31e56e5ed32b4b" Oct 01 11:55:33 crc kubenswrapper[4669]: I1001 11:55:33.534189 4669 scope.go:117] "RemoveContainer" containerID="fafbf17a957a30559a9f42b05f5c9e0630ba324f518feff44d8cdb84b167233b" Oct 01 11:55:33 crc kubenswrapper[4669]: I1001 11:55:33.579875 4669 scope.go:117] "RemoveContainer" containerID="03a0061ec24c0c5ec4f3119fa161aec560d6d14ccdf5805e83305e638168f575" Oct 01 11:55:33 crc kubenswrapper[4669]: I1001 
11:55:33.640439 4669 scope.go:117] "RemoveContainer" containerID="eaffd0b2a5721d1440159064d5ae31f27b5c8556da99e327b69e1af73951ee1a" Oct 01 11:55:33 crc kubenswrapper[4669]: I1001 11:55:33.706395 4669 scope.go:117] "RemoveContainer" containerID="47c2c32808c1d5ac1efdf38d9c27b318d984ef8a5eb4378fb124006bd1d7be47" Oct 01 11:55:33 crc kubenswrapper[4669]: I1001 11:55:33.758830 4669 scope.go:117] "RemoveContainer" containerID="979c5a5080442e14529cd058c181b2ec8bedca90c7e9b585e65c10266945e3e2" Oct 01 11:55:33 crc kubenswrapper[4669]: I1001 11:55:33.813038 4669 scope.go:117] "RemoveContainer" containerID="3774d6df56756e5ff37df2699f23239a4813c9f5a4f6ab7863436d871ee0cef5" Oct 01 11:55:33 crc kubenswrapper[4669]: I1001 11:55:33.842683 4669 scope.go:117] "RemoveContainer" containerID="8712159fecc46b7da0621abb792f3ac80a839cf4e6caa9246907464a882aaafb" Oct 01 11:55:33 crc kubenswrapper[4669]: I1001 11:55:33.878758 4669 scope.go:117] "RemoveContainer" containerID="8542db483d60c269ea4a36a8741638b0b0bc3e35efa5c978c0cc59f783b9b7ed" Oct 01 11:55:33 crc kubenswrapper[4669]: I1001 11:55:33.919021 4669 scope.go:117] "RemoveContainer" containerID="45a93ebfaefd73fbf00378cd17f3e531c19a7d701114b2f94b4e28dea624906d" Oct 01 11:55:34 crc kubenswrapper[4669]: I1001 11:55:34.046884 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ttgc2"] Oct 01 11:55:34 crc kubenswrapper[4669]: I1001 11:55:34.055773 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ttgc2"] Oct 01 11:55:35 crc kubenswrapper[4669]: I1001 11:55:35.044513 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5197-account-create-5fftd"] Oct 01 11:55:35 crc kubenswrapper[4669]: I1001 11:55:35.055921 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a0d9-account-create-m6bsh"] Oct 01 11:55:35 crc kubenswrapper[4669]: I1001 11:55:35.072763 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-5197-account-create-5fftd"] Oct 01 11:55:35 crc kubenswrapper[4669]: I1001 11:55:35.085156 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a0d9-account-create-m6bsh"] Oct 01 11:55:35 crc kubenswrapper[4669]: I1001 11:55:35.670302 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22e5eb36-78d1-4d8c-85ec-330fae011103" path="/var/lib/kubelet/pods/22e5eb36-78d1-4d8c-85ec-330fae011103/volumes" Oct 01 11:55:35 crc kubenswrapper[4669]: I1001 11:55:35.671573 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3a2882-862b-4f3d-91e2-0f18d1960a91" path="/var/lib/kubelet/pods/8b3a2882-862b-4f3d-91e2-0f18d1960a91/volumes" Oct 01 11:55:35 crc kubenswrapper[4669]: I1001 11:55:35.672691 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca2aebeb-0425-4f89-b3a9-be541ae1e07c" path="/var/lib/kubelet/pods/ca2aebeb-0425-4f89-b3a9-be541ae1e07c/volumes" Oct 01 11:55:42 crc kubenswrapper[4669]: I1001 11:55:42.174137 4669 generic.go:334] "Generic (PLEG): container finished" podID="261f1c48-3c07-495d-b916-861c2a1943d8" containerID="925514b17efcda5c28965e0f6423f45b6872ed4dd95bd1981ec21f4888c90264" exitCode=0 Oct 01 11:55:42 crc kubenswrapper[4669]: I1001 11:55:42.174371 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" event={"ID":"261f1c48-3c07-495d-b916-861c2a1943d8","Type":"ContainerDied","Data":"925514b17efcda5c28965e0f6423f45b6872ed4dd95bd1981ec21f4888c90264"} Oct 01 11:55:43 crc kubenswrapper[4669]: I1001 11:55:43.670484 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:55:43 crc kubenswrapper[4669]: I1001 11:55:43.859749 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-ssh-key\") pod \"261f1c48-3c07-495d-b916-861c2a1943d8\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " Oct 01 11:55:43 crc kubenswrapper[4669]: I1001 11:55:43.859856 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-inventory\") pod \"261f1c48-3c07-495d-b916-861c2a1943d8\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " Oct 01 11:55:43 crc kubenswrapper[4669]: I1001 11:55:43.860105 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87czz\" (UniqueName: \"kubernetes.io/projected/261f1c48-3c07-495d-b916-861c2a1943d8-kube-api-access-87czz\") pod \"261f1c48-3c07-495d-b916-861c2a1943d8\" (UID: \"261f1c48-3c07-495d-b916-861c2a1943d8\") " Oct 01 11:55:43 crc kubenswrapper[4669]: I1001 11:55:43.869426 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/261f1c48-3c07-495d-b916-861c2a1943d8-kube-api-access-87czz" (OuterVolumeSpecName: "kube-api-access-87czz") pod "261f1c48-3c07-495d-b916-861c2a1943d8" (UID: "261f1c48-3c07-495d-b916-861c2a1943d8"). InnerVolumeSpecName "kube-api-access-87czz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:55:43 crc kubenswrapper[4669]: I1001 11:55:43.897432 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "261f1c48-3c07-495d-b916-861c2a1943d8" (UID: "261f1c48-3c07-495d-b916-861c2a1943d8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:55:43 crc kubenswrapper[4669]: I1001 11:55:43.901002 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-inventory" (OuterVolumeSpecName: "inventory") pod "261f1c48-3c07-495d-b916-861c2a1943d8" (UID: "261f1c48-3c07-495d-b916-861c2a1943d8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:55:43 crc kubenswrapper[4669]: I1001 11:55:43.962758 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:55:43 crc kubenswrapper[4669]: I1001 11:55:43.962817 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/261f1c48-3c07-495d-b916-861c2a1943d8-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 11:55:43 crc kubenswrapper[4669]: I1001 11:55:43.962828 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87czz\" (UniqueName: \"kubernetes.io/projected/261f1c48-3c07-495d-b916-861c2a1943d8-kube-api-access-87czz\") on node \"crc\" DevicePath \"\"" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.203794 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" event={"ID":"261f1c48-3c07-495d-b916-861c2a1943d8","Type":"ContainerDied","Data":"560c9dc2237377b3750e5941fa48dba879a5b47f15012cdc9985e10fd38d06c9"} Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.204290 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="560c9dc2237377b3750e5941fa48dba879a5b47f15012cdc9985e10fd38d06c9" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.203888 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rvw82" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.309271 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc"] Oct 01 11:55:44 crc kubenswrapper[4669]: E1001 11:55:44.310323 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b474be11-cbe7-4c80-a666-603d792dde90" containerName="registry-server" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.310464 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b474be11-cbe7-4c80-a666-603d792dde90" containerName="registry-server" Oct 01 11:55:44 crc kubenswrapper[4669]: E1001 11:55:44.310555 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b474be11-cbe7-4c80-a666-603d792dde90" containerName="extract-content" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.310638 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b474be11-cbe7-4c80-a666-603d792dde90" containerName="extract-content" Oct 01 11:55:44 crc kubenswrapper[4669]: E1001 11:55:44.310729 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261f1c48-3c07-495d-b916-861c2a1943d8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.310802 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="261f1c48-3c07-495d-b916-861c2a1943d8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 11:55:44 crc kubenswrapper[4669]: E1001 11:55:44.310900 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b474be11-cbe7-4c80-a666-603d792dde90" containerName="extract-utilities" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.310973 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b474be11-cbe7-4c80-a666-603d792dde90" containerName="extract-utilities" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 
11:55:44.311319 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b474be11-cbe7-4c80-a666-603d792dde90" containerName="registry-server" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.311424 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="261f1c48-3c07-495d-b916-861c2a1943d8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.312449 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.315006 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.320898 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.320913 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.320927 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.323450 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc"] Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.474666 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knpjc\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:55:44 crc 
kubenswrapper[4669]: I1001 11:55:44.475352 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ls8\" (UniqueName: \"kubernetes.io/projected/d753b30d-e1c5-45b9-8d78-767dd0cadaea-kube-api-access-v8ls8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knpjc\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.475630 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knpjc\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.578486 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knpjc\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.579047 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ls8\" (UniqueName: \"kubernetes.io/projected/d753b30d-e1c5-45b9-8d78-767dd0cadaea-kube-api-access-v8ls8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knpjc\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.579356 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knpjc\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.583729 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knpjc\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.583883 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knpjc\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.606695 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ls8\" (UniqueName: \"kubernetes.io/projected/d753b30d-e1c5-45b9-8d78-767dd0cadaea-kube-api-access-v8ls8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knpjc\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:55:44 crc kubenswrapper[4669]: I1001 11:55:44.642667 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:55:45 crc kubenswrapper[4669]: I1001 11:55:45.279976 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc"] Oct 01 11:55:45 crc kubenswrapper[4669]: I1001 11:55:45.282504 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 11:55:46 crc kubenswrapper[4669]: I1001 11:55:46.232313 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" event={"ID":"d753b30d-e1c5-45b9-8d78-767dd0cadaea","Type":"ContainerStarted","Data":"6f38e8761efc437567d47c3fd7bb13dd96b63ab7ca5faefe3383aea85556e74c"} Oct 01 11:55:46 crc kubenswrapper[4669]: I1001 11:55:46.233214 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" event={"ID":"d753b30d-e1c5-45b9-8d78-767dd0cadaea","Type":"ContainerStarted","Data":"13879f60b68402c28a88ed66bde53d87d03a136855e034337cd0ab66bfb00281"} Oct 01 11:55:46 crc kubenswrapper[4669]: I1001 11:55:46.265959 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" podStartSLOduration=1.822340208 podStartE2EDuration="2.26593479s" podCreationTimestamp="2025-10-01 11:55:44 +0000 UTC" firstStartedPulling="2025-10-01 11:55:45.282026234 +0000 UTC m=+1636.381591251" lastFinishedPulling="2025-10-01 11:55:45.725620816 +0000 UTC m=+1636.825185833" observedRunningTime="2025-10-01 11:55:46.256909085 +0000 UTC m=+1637.356474062" watchObservedRunningTime="2025-10-01 11:55:46.26593479 +0000 UTC m=+1637.365499767" Oct 01 11:56:01 crc kubenswrapper[4669]: I1001 11:56:01.864179 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:56:01 crc kubenswrapper[4669]: I1001 11:56:01.866307 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:56:17 crc kubenswrapper[4669]: I1001 11:56:17.081130 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-997b7"] Oct 01 11:56:17 crc kubenswrapper[4669]: I1001 11:56:17.095133 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-997b7"] Oct 01 11:56:17 crc kubenswrapper[4669]: I1001 11:56:17.658450 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a28038-27bc-4a9f-be99-657225a3b9e5" path="/var/lib/kubelet/pods/55a28038-27bc-4a9f-be99-657225a3b9e5/volumes" Oct 01 11:56:20 crc kubenswrapper[4669]: I1001 11:56:20.072888 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-s89kf"] Oct 01 11:56:20 crc kubenswrapper[4669]: I1001 11:56:20.084136 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-s89kf"] Oct 01 11:56:21 crc kubenswrapper[4669]: I1001 11:56:21.660139 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c85d289-ff7f-4b57-a54a-cb272dec58e2" path="/var/lib/kubelet/pods/6c85d289-ff7f-4b57-a54a-cb272dec58e2/volumes" Oct 01 11:56:28 crc kubenswrapper[4669]: I1001 11:56:28.036064 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-2cbqn"] Oct 01 11:56:28 crc kubenswrapper[4669]: I1001 11:56:28.049612 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-sync-8r7vt"] Oct 01 11:56:28 crc kubenswrapper[4669]: I1001 11:56:28.059055 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-2cbqn"] Oct 01 11:56:28 crc kubenswrapper[4669]: I1001 11:56:28.069408 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8r7vt"] Oct 01 11:56:29 crc kubenswrapper[4669]: I1001 11:56:29.046744 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-nfxsr"] Oct 01 11:56:29 crc kubenswrapper[4669]: I1001 11:56:29.057213 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-nfxsr"] Oct 01 11:56:29 crc kubenswrapper[4669]: I1001 11:56:29.671124 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15da5802-a63f-44a7-b5b2-9f85b62e6675" path="/var/lib/kubelet/pods/15da5802-a63f-44a7-b5b2-9f85b62e6675/volumes" Oct 01 11:56:29 crc kubenswrapper[4669]: I1001 11:56:29.673347 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4814501d-3b55-40bb-b932-41f91ca1d7fb" path="/var/lib/kubelet/pods/4814501d-3b55-40bb-b932-41f91ca1d7fb/volumes" Oct 01 11:56:29 crc kubenswrapper[4669]: I1001 11:56:29.674406 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee3395f1-0549-4bc4-a145-42ff20c37da6" path="/var/lib/kubelet/pods/ee3395f1-0549-4bc4-a145-42ff20c37da6/volumes" Oct 01 11:56:31 crc kubenswrapper[4669]: I1001 11:56:31.863661 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:56:31 crc kubenswrapper[4669]: I1001 11:56:31.864245 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" 
podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:56:31 crc kubenswrapper[4669]: I1001 11:56:31.864322 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 11:56:31 crc kubenswrapper[4669]: I1001 11:56:31.865536 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 11:56:31 crc kubenswrapper[4669]: I1001 11:56:31.865628 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" gracePeriod=600 Oct 01 11:56:32 crc kubenswrapper[4669]: E1001 11:56:32.036292 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:56:32 crc kubenswrapper[4669]: I1001 11:56:32.807376 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" exitCode=0 Oct 01 
11:56:32 crc kubenswrapper[4669]: I1001 11:56:32.807448 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55"} Oct 01 11:56:32 crc kubenswrapper[4669]: I1001 11:56:32.807530 4669 scope.go:117] "RemoveContainer" containerID="7b1236276e91901ca356b23317942bcba8b16d3a037aab0000d2acb95db6570b" Oct 01 11:56:32 crc kubenswrapper[4669]: I1001 11:56:32.808733 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:56:32 crc kubenswrapper[4669]: E1001 11:56:32.809319 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:56:34 crc kubenswrapper[4669]: I1001 11:56:34.196592 4669 scope.go:117] "RemoveContainer" containerID="4165672b7ed2689527620aa0e5fe0c14b451ea5de820cb08dc80918417dcde21" Oct 01 11:56:34 crc kubenswrapper[4669]: I1001 11:56:34.246030 4669 scope.go:117] "RemoveContainer" containerID="3fefcc43736ea6e549f0effcb874d8eaf368f5dc8c96e5a31de2e1e49dab6292" Oct 01 11:56:34 crc kubenswrapper[4669]: I1001 11:56:34.324126 4669 scope.go:117] "RemoveContainer" containerID="609b41669b067291745a6bbf95e71d443920b227d4e31771406a5637776767ba" Oct 01 11:56:34 crc kubenswrapper[4669]: I1001 11:56:34.365259 4669 scope.go:117] "RemoveContainer" containerID="86aa968b793f6e17beb2a4767540d3201a44fefdd39639ca30686796c0f121a5" Oct 01 11:56:34 crc kubenswrapper[4669]: I1001 11:56:34.415109 4669 scope.go:117] "RemoveContainer" 
containerID="c24bbdba316afc8a8949f202d2513184086ae3dcafd406e7fbfc58eeb61fd282" Oct 01 11:56:34 crc kubenswrapper[4669]: I1001 11:56:34.484574 4669 scope.go:117] "RemoveContainer" containerID="63f3bc3c7a3e9ae9344dc85fbe8e3df0db83cb54290f6006f17fe64976a0c746" Oct 01 11:56:34 crc kubenswrapper[4669]: I1001 11:56:34.530976 4669 scope.go:117] "RemoveContainer" containerID="0ae6a7126e7381ab1718ccd6b3e2637a6ce2cd81c70734a872755d4a57a58bd9" Oct 01 11:56:34 crc kubenswrapper[4669]: I1001 11:56:34.580639 4669 scope.go:117] "RemoveContainer" containerID="efd7eb073d20d63343033600c20646c1007af88a45c40f636ac0309656383f1f" Oct 01 11:56:46 crc kubenswrapper[4669]: I1001 11:56:46.040548 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-h6rw6"] Oct 01 11:56:46 crc kubenswrapper[4669]: I1001 11:56:46.053115 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-h6rw6"] Oct 01 11:56:47 crc kubenswrapper[4669]: I1001 11:56:47.644748 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:56:47 crc kubenswrapper[4669]: E1001 11:56:47.645683 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:56:47 crc kubenswrapper[4669]: I1001 11:56:47.665531 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db2bb6cb-ab40-4534-967e-c71b62323512" path="/var/lib/kubelet/pods/db2bb6cb-ab40-4534-967e-c71b62323512/volumes" Oct 01 11:57:01 crc kubenswrapper[4669]: I1001 11:57:01.644175 4669 scope.go:117] "RemoveContainer" 
containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:57:01 crc kubenswrapper[4669]: E1001 11:57:01.645337 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:57:04 crc kubenswrapper[4669]: I1001 11:57:04.226731 4669 generic.go:334] "Generic (PLEG): container finished" podID="d753b30d-e1c5-45b9-8d78-767dd0cadaea" containerID="6f38e8761efc437567d47c3fd7bb13dd96b63ab7ca5faefe3383aea85556e74c" exitCode=0 Oct 01 11:57:04 crc kubenswrapper[4669]: I1001 11:57:04.226831 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" event={"ID":"d753b30d-e1c5-45b9-8d78-767dd0cadaea","Type":"ContainerDied","Data":"6f38e8761efc437567d47c3fd7bb13dd96b63ab7ca5faefe3383aea85556e74c"} Oct 01 11:57:05 crc kubenswrapper[4669]: I1001 11:57:05.807114 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:57:05 crc kubenswrapper[4669]: I1001 11:57:05.911493 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8ls8\" (UniqueName: \"kubernetes.io/projected/d753b30d-e1c5-45b9-8d78-767dd0cadaea-kube-api-access-v8ls8\") pod \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " Oct 01 11:57:05 crc kubenswrapper[4669]: I1001 11:57:05.911595 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-inventory\") pod \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " Oct 01 11:57:05 crc kubenswrapper[4669]: I1001 11:57:05.911688 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-ssh-key\") pod \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\" (UID: \"d753b30d-e1c5-45b9-8d78-767dd0cadaea\") " Oct 01 11:57:05 crc kubenswrapper[4669]: I1001 11:57:05.920998 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d753b30d-e1c5-45b9-8d78-767dd0cadaea-kube-api-access-v8ls8" (OuterVolumeSpecName: "kube-api-access-v8ls8") pod "d753b30d-e1c5-45b9-8d78-767dd0cadaea" (UID: "d753b30d-e1c5-45b9-8d78-767dd0cadaea"). InnerVolumeSpecName "kube-api-access-v8ls8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:57:05 crc kubenswrapper[4669]: I1001 11:57:05.951680 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-inventory" (OuterVolumeSpecName: "inventory") pod "d753b30d-e1c5-45b9-8d78-767dd0cadaea" (UID: "d753b30d-e1c5-45b9-8d78-767dd0cadaea"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:57:05 crc kubenswrapper[4669]: I1001 11:57:05.956334 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d753b30d-e1c5-45b9-8d78-767dd0cadaea" (UID: "d753b30d-e1c5-45b9-8d78-767dd0cadaea"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.015618 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.015670 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8ls8\" (UniqueName: \"kubernetes.io/projected/d753b30d-e1c5-45b9-8d78-767dd0cadaea-kube-api-access-v8ls8\") on node \"crc\" DevicePath \"\"" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.015692 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d753b30d-e1c5-45b9-8d78-767dd0cadaea-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.279021 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" event={"ID":"d753b30d-e1c5-45b9-8d78-767dd0cadaea","Type":"ContainerDied","Data":"13879f60b68402c28a88ed66bde53d87d03a136855e034337cd0ab66bfb00281"} Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.279119 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13879f60b68402c28a88ed66bde53d87d03a136855e034337cd0ab66bfb00281" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.279166 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knpjc" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.386616 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp"] Oct 01 11:57:06 crc kubenswrapper[4669]: E1001 11:57:06.387387 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d753b30d-e1c5-45b9-8d78-767dd0cadaea" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.387413 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d753b30d-e1c5-45b9-8d78-767dd0cadaea" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.387809 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d753b30d-e1c5-45b9-8d78-767dd0cadaea" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.388955 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.392005 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.394402 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.394804 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.395803 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.400952 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp"] Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.424068 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-krxsp\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.424472 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgm8t\" (UniqueName: \"kubernetes.io/projected/74c54aa8-261e-4bad-babf-2838c6b49114-kube-api-access-tgm8t\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-krxsp\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 
11:57:06.424511 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-krxsp\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.526767 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-krxsp\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.526846 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgm8t\" (UniqueName: \"kubernetes.io/projected/74c54aa8-261e-4bad-babf-2838c6b49114-kube-api-access-tgm8t\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-krxsp\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.526884 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-krxsp\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.532215 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-krxsp\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.535795 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-krxsp\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.552459 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgm8t\" (UniqueName: \"kubernetes.io/projected/74c54aa8-261e-4bad-babf-2838c6b49114-kube-api-access-tgm8t\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-krxsp\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:06 crc kubenswrapper[4669]: I1001 11:57:06.718097 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:07 crc kubenswrapper[4669]: I1001 11:57:07.405803 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp"] Oct 01 11:57:07 crc kubenswrapper[4669]: W1001 11:57:07.414249 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74c54aa8_261e_4bad_babf_2838c6b49114.slice/crio-21ab07ff8372fc25decd3b3bc1a53d03376962ee1ca8ae4525970758d65c9df4 WatchSource:0}: Error finding container 21ab07ff8372fc25decd3b3bc1a53d03376962ee1ca8ae4525970758d65c9df4: Status 404 returned error can't find the container with id 21ab07ff8372fc25decd3b3bc1a53d03376962ee1ca8ae4525970758d65c9df4 Oct 01 11:57:08 crc kubenswrapper[4669]: I1001 11:57:08.306660 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" event={"ID":"74c54aa8-261e-4bad-babf-2838c6b49114","Type":"ContainerStarted","Data":"21ab07ff8372fc25decd3b3bc1a53d03376962ee1ca8ae4525970758d65c9df4"} Oct 01 11:57:09 crc kubenswrapper[4669]: I1001 11:57:09.326750 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" event={"ID":"74c54aa8-261e-4bad-babf-2838c6b49114","Type":"ContainerStarted","Data":"f6a9fe423e229e9c5c3558f079883bb9f66bb69eb7c148fe110f598fab4c2c64"} Oct 01 11:57:09 crc kubenswrapper[4669]: I1001 11:57:09.361210 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" podStartSLOduration=2.559743601 podStartE2EDuration="3.361171988s" podCreationTimestamp="2025-10-01 11:57:06 +0000 UTC" firstStartedPulling="2025-10-01 11:57:07.419556988 +0000 UTC m=+1718.519121965" lastFinishedPulling="2025-10-01 11:57:08.220985335 +0000 UTC 
m=+1719.320550352" observedRunningTime="2025-10-01 11:57:09.355392934 +0000 UTC m=+1720.454957941" watchObservedRunningTime="2025-10-01 11:57:09.361171988 +0000 UTC m=+1720.460736995" Oct 01 11:57:13 crc kubenswrapper[4669]: I1001 11:57:13.066057 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-b72rx"] Oct 01 11:57:13 crc kubenswrapper[4669]: I1001 11:57:13.072249 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-x5w8f"] Oct 01 11:57:13 crc kubenswrapper[4669]: I1001 11:57:13.079303 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-8tkmw"] Oct 01 11:57:13 crc kubenswrapper[4669]: I1001 11:57:13.086067 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-b72rx"] Oct 01 11:57:13 crc kubenswrapper[4669]: I1001 11:57:13.095400 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-x5w8f"] Oct 01 11:57:13 crc kubenswrapper[4669]: I1001 11:57:13.105284 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-8tkmw"] Oct 01 11:57:13 crc kubenswrapper[4669]: I1001 11:57:13.663594 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dbbc207-4a1d-40c3-8392-ebfc5def670a" path="/var/lib/kubelet/pods/7dbbc207-4a1d-40c3-8392-ebfc5def670a/volumes" Oct 01 11:57:13 crc kubenswrapper[4669]: I1001 11:57:13.664326 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebda0307-643b-4933-aea3-a3ea9b534f50" path="/var/lib/kubelet/pods/ebda0307-643b-4933-aea3-a3ea9b534f50/volumes" Oct 01 11:57:13 crc kubenswrapper[4669]: I1001 11:57:13.664821 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0" path="/var/lib/kubelet/pods/f4d9cd99-48a5-4675-9a7a-258d0cfa5ee0/volumes" Oct 01 11:57:14 crc kubenswrapper[4669]: I1001 11:57:14.384376 4669 generic.go:334] "Generic 
(PLEG): container finished" podID="74c54aa8-261e-4bad-babf-2838c6b49114" containerID="f6a9fe423e229e9c5c3558f079883bb9f66bb69eb7c148fe110f598fab4c2c64" exitCode=0 Oct 01 11:57:14 crc kubenswrapper[4669]: I1001 11:57:14.384521 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" event={"ID":"74c54aa8-261e-4bad-babf-2838c6b49114","Type":"ContainerDied","Data":"f6a9fe423e229e9c5c3558f079883bb9f66bb69eb7c148fe110f598fab4c2c64"} Oct 01 11:57:14 crc kubenswrapper[4669]: I1001 11:57:14.644401 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:57:14 crc kubenswrapper[4669]: E1001 11:57:14.644922 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.002795 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.170267 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-ssh-key\") pod \"74c54aa8-261e-4bad-babf-2838c6b49114\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.170623 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-inventory\") pod \"74c54aa8-261e-4bad-babf-2838c6b49114\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.170810 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgm8t\" (UniqueName: \"kubernetes.io/projected/74c54aa8-261e-4bad-babf-2838c6b49114-kube-api-access-tgm8t\") pod \"74c54aa8-261e-4bad-babf-2838c6b49114\" (UID: \"74c54aa8-261e-4bad-babf-2838c6b49114\") " Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.181442 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c54aa8-261e-4bad-babf-2838c6b49114-kube-api-access-tgm8t" (OuterVolumeSpecName: "kube-api-access-tgm8t") pod "74c54aa8-261e-4bad-babf-2838c6b49114" (UID: "74c54aa8-261e-4bad-babf-2838c6b49114"). InnerVolumeSpecName "kube-api-access-tgm8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.225288 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-inventory" (OuterVolumeSpecName: "inventory") pod "74c54aa8-261e-4bad-babf-2838c6b49114" (UID: "74c54aa8-261e-4bad-babf-2838c6b49114"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.227196 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "74c54aa8-261e-4bad-babf-2838c6b49114" (UID: "74c54aa8-261e-4bad-babf-2838c6b49114"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.274345 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.274387 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgm8t\" (UniqueName: \"kubernetes.io/projected/74c54aa8-261e-4bad-babf-2838c6b49114-kube-api-access-tgm8t\") on node \"crc\" DevicePath \"\"" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.274406 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74c54aa8-261e-4bad-babf-2838c6b49114-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.418284 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" event={"ID":"74c54aa8-261e-4bad-babf-2838c6b49114","Type":"ContainerDied","Data":"21ab07ff8372fc25decd3b3bc1a53d03376962ee1ca8ae4525970758d65c9df4"} Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.418374 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21ab07ff8372fc25decd3b3bc1a53d03376962ee1ca8ae4525970758d65c9df4" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.418522 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-krxsp" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.537874 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr"] Oct 01 11:57:16 crc kubenswrapper[4669]: E1001 11:57:16.538404 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c54aa8-261e-4bad-babf-2838c6b49114" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.538424 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c54aa8-261e-4bad-babf-2838c6b49114" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.538658 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c54aa8-261e-4bad-babf-2838c6b49114" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.539399 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.543474 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.544043 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.544934 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.545238 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.551375 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr"] Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.684627 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h922\" (UniqueName: \"kubernetes.io/projected/b71b4047-5538-4132-9247-8b9b34e6979c-kube-api-access-5h922\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blscr\" (UID: \"b71b4047-5538-4132-9247-8b9b34e6979c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.685008 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blscr\" (UID: \"b71b4047-5538-4132-9247-8b9b34e6979c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.685057 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blscr\" (UID: \"b71b4047-5538-4132-9247-8b9b34e6979c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.788594 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h922\" (UniqueName: \"kubernetes.io/projected/b71b4047-5538-4132-9247-8b9b34e6979c-kube-api-access-5h922\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blscr\" (UID: \"b71b4047-5538-4132-9247-8b9b34e6979c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.788744 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blscr\" (UID: \"b71b4047-5538-4132-9247-8b9b34e6979c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.788841 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blscr\" (UID: \"b71b4047-5538-4132-9247-8b9b34e6979c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.796567 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blscr\" (UID: 
\"b71b4047-5538-4132-9247-8b9b34e6979c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.810434 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blscr\" (UID: \"b71b4047-5538-4132-9247-8b9b34e6979c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.819779 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h922\" (UniqueName: \"kubernetes.io/projected/b71b4047-5538-4132-9247-8b9b34e6979c-kube-api-access-5h922\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-blscr\" (UID: \"b71b4047-5538-4132-9247-8b9b34e6979c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:57:16 crc kubenswrapper[4669]: I1001 11:57:16.864380 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:57:17 crc kubenswrapper[4669]: I1001 11:57:17.509861 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr"] Oct 01 11:57:18 crc kubenswrapper[4669]: I1001 11:57:18.038374 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5926-account-create-lhhnq"] Oct 01 11:57:18 crc kubenswrapper[4669]: I1001 11:57:18.047257 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-5926-account-create-lhhnq"] Oct 01 11:57:18 crc kubenswrapper[4669]: I1001 11:57:18.444120 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" event={"ID":"b71b4047-5538-4132-9247-8b9b34e6979c","Type":"ContainerStarted","Data":"5f09b275b21b3a369413a7adeb6063de580922446951f566be3f65912261339d"} Oct 01 11:57:18 crc kubenswrapper[4669]: I1001 11:57:18.444709 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" event={"ID":"b71b4047-5538-4132-9247-8b9b34e6979c","Type":"ContainerStarted","Data":"34d9820c43d6ed58e34e299f824737ba0ba6e53ef25c3d93bd16f36eb5a0014f"} Oct 01 11:57:18 crc kubenswrapper[4669]: I1001 11:57:18.485190 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" podStartSLOduration=2.057377668 podStartE2EDuration="2.485154656s" podCreationTimestamp="2025-10-01 11:57:16 +0000 UTC" firstStartedPulling="2025-10-01 11:57:17.525429582 +0000 UTC m=+1728.624994559" lastFinishedPulling="2025-10-01 11:57:17.95320653 +0000 UTC m=+1729.052771547" observedRunningTime="2025-10-01 11:57:18.469854145 +0000 UTC m=+1729.569419202" watchObservedRunningTime="2025-10-01 11:57:18.485154656 +0000 UTC m=+1729.584719673" Oct 01 11:57:19 crc kubenswrapper[4669]: I1001 
11:57:19.049303 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a85c-account-create-qmwxm"] Oct 01 11:57:19 crc kubenswrapper[4669]: I1001 11:57:19.060304 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a85c-account-create-qmwxm"] Oct 01 11:57:19 crc kubenswrapper[4669]: I1001 11:57:19.665488 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e0ee30-1e97-4d8d-8f57-f94949a53291" path="/var/lib/kubelet/pods/35e0ee30-1e97-4d8d-8f57-f94949a53291/volumes" Oct 01 11:57:19 crc kubenswrapper[4669]: I1001 11:57:19.666703 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d88c22d8-fe18-470e-87c4-9ef21beeccce" path="/var/lib/kubelet/pods/d88c22d8-fe18-470e-87c4-9ef21beeccce/volumes" Oct 01 11:57:25 crc kubenswrapper[4669]: I1001 11:57:25.644741 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:57:25 crc kubenswrapper[4669]: E1001 11:57:25.645772 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:57:33 crc kubenswrapper[4669]: I1001 11:57:33.040252 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c7ad-account-create-jf4nw"] Oct 01 11:57:33 crc kubenswrapper[4669]: I1001 11:57:33.047543 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c7ad-account-create-jf4nw"] Oct 01 11:57:33 crc kubenswrapper[4669]: I1001 11:57:33.662182 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b9cae74-a51f-4d18-949d-ca999e48f5e3" 
path="/var/lib/kubelet/pods/3b9cae74-a51f-4d18-949d-ca999e48f5e3/volumes" Oct 01 11:57:34 crc kubenswrapper[4669]: I1001 11:57:34.820734 4669 scope.go:117] "RemoveContainer" containerID="4505d76d3141066e11a4ca1c6d2d359b10840e42c9705e611f9ebb14844fa9c2" Oct 01 11:57:34 crc kubenswrapper[4669]: I1001 11:57:34.857605 4669 scope.go:117] "RemoveContainer" containerID="7cb5359763691dde1e21a6058bd3daddb4c1ba78f9bc344afb0bbddd8356ed6d" Oct 01 11:57:34 crc kubenswrapper[4669]: I1001 11:57:34.924948 4669 scope.go:117] "RemoveContainer" containerID="f67efb9a0c894ae8c54a4f8cfe6605d05de698063188f932a48676a902d4fa02" Oct 01 11:57:35 crc kubenswrapper[4669]: I1001 11:57:35.008344 4669 scope.go:117] "RemoveContainer" containerID="e7a3efef00b92e67553b4b954621149b11ce00a8f4ed592ee39909410c19d004" Oct 01 11:57:35 crc kubenswrapper[4669]: I1001 11:57:35.062323 4669 scope.go:117] "RemoveContainer" containerID="baa275b2a84c20c97af4d6ca9eba17ead71a873fa28209b883370788d9055296" Oct 01 11:57:35 crc kubenswrapper[4669]: I1001 11:57:35.089604 4669 scope.go:117] "RemoveContainer" containerID="6c758ca5869845c8abc0c014a267fcfc25f4243a9ec74a2860849eb6ccea90e0" Oct 01 11:57:35 crc kubenswrapper[4669]: I1001 11:57:35.135352 4669 scope.go:117] "RemoveContainer" containerID="5ab148946e5438398447ef1a5981acc07e37f691e0d3f16990af903bb543284b" Oct 01 11:57:38 crc kubenswrapper[4669]: I1001 11:57:38.644982 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:57:38 crc kubenswrapper[4669]: E1001 11:57:38.645886 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" 
Oct 01 11:57:48 crc kubenswrapper[4669]: I1001 11:57:48.048814 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gcjc4"] Oct 01 11:57:48 crc kubenswrapper[4669]: I1001 11:57:48.063714 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gcjc4"] Oct 01 11:57:49 crc kubenswrapper[4669]: I1001 11:57:49.667885 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:57:49 crc kubenswrapper[4669]: E1001 11:57:49.671333 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:57:49 crc kubenswrapper[4669]: I1001 11:57:49.681174 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699259f2-9bb3-42f1-b04f-d95ab275e1aa" path="/var/lib/kubelet/pods/699259f2-9bb3-42f1-b04f-d95ab275e1aa/volumes" Oct 01 11:58:04 crc kubenswrapper[4669]: I1001 11:58:04.644668 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:58:04 crc kubenswrapper[4669]: E1001 11:58:04.645798 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:58:05 crc kubenswrapper[4669]: I1001 11:58:05.064811 
4669 generic.go:334] "Generic (PLEG): container finished" podID="b71b4047-5538-4132-9247-8b9b34e6979c" containerID="5f09b275b21b3a369413a7adeb6063de580922446951f566be3f65912261339d" exitCode=0 Oct 01 11:58:05 crc kubenswrapper[4669]: I1001 11:58:05.064910 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" event={"ID":"b71b4047-5538-4132-9247-8b9b34e6979c","Type":"ContainerDied","Data":"5f09b275b21b3a369413a7adeb6063de580922446951f566be3f65912261339d"} Oct 01 11:58:06 crc kubenswrapper[4669]: I1001 11:58:06.623309 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:58:06 crc kubenswrapper[4669]: I1001 11:58:06.754778 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-ssh-key\") pod \"b71b4047-5538-4132-9247-8b9b34e6979c\" (UID: \"b71b4047-5538-4132-9247-8b9b34e6979c\") " Oct 01 11:58:06 crc kubenswrapper[4669]: I1001 11:58:06.754962 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-inventory\") pod \"b71b4047-5538-4132-9247-8b9b34e6979c\" (UID: \"b71b4047-5538-4132-9247-8b9b34e6979c\") " Oct 01 11:58:06 crc kubenswrapper[4669]: I1001 11:58:06.755045 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h922\" (UniqueName: \"kubernetes.io/projected/b71b4047-5538-4132-9247-8b9b34e6979c-kube-api-access-5h922\") pod \"b71b4047-5538-4132-9247-8b9b34e6979c\" (UID: \"b71b4047-5538-4132-9247-8b9b34e6979c\") " Oct 01 11:58:06 crc kubenswrapper[4669]: I1001 11:58:06.761678 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b71b4047-5538-4132-9247-8b9b34e6979c-kube-api-access-5h922" (OuterVolumeSpecName: "kube-api-access-5h922") pod "b71b4047-5538-4132-9247-8b9b34e6979c" (UID: "b71b4047-5538-4132-9247-8b9b34e6979c"). InnerVolumeSpecName "kube-api-access-5h922". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:58:06 crc kubenswrapper[4669]: I1001 11:58:06.792746 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b71b4047-5538-4132-9247-8b9b34e6979c" (UID: "b71b4047-5538-4132-9247-8b9b34e6979c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:58:06 crc kubenswrapper[4669]: I1001 11:58:06.816156 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-inventory" (OuterVolumeSpecName: "inventory") pod "b71b4047-5538-4132-9247-8b9b34e6979c" (UID: "b71b4047-5538-4132-9247-8b9b34e6979c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:58:06 crc kubenswrapper[4669]: I1001 11:58:06.857902 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 11:58:06 crc kubenswrapper[4669]: I1001 11:58:06.857953 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h922\" (UniqueName: \"kubernetes.io/projected/b71b4047-5538-4132-9247-8b9b34e6979c-kube-api-access-5h922\") on node \"crc\" DevicePath \"\"" Oct 01 11:58:06 crc kubenswrapper[4669]: I1001 11:58:06.857974 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b71b4047-5538-4132-9247-8b9b34e6979c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.090152 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" event={"ID":"b71b4047-5538-4132-9247-8b9b34e6979c","Type":"ContainerDied","Data":"34d9820c43d6ed58e34e299f824737ba0ba6e53ef25c3d93bd16f36eb5a0014f"} Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.090200 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34d9820c43d6ed58e34e299f824737ba0ba6e53ef25c3d93bd16f36eb5a0014f" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.090233 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-blscr" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.207218 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99"] Oct 01 11:58:07 crc kubenswrapper[4669]: E1001 11:58:07.207875 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71b4047-5538-4132-9247-8b9b34e6979c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.207908 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71b4047-5538-4132-9247-8b9b34e6979c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.208231 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="b71b4047-5538-4132-9247-8b9b34e6979c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.209365 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.214050 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.214400 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.214452 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.214637 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.231385 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99"] Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.370526 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tpr99\" (UID: \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.370953 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpthv\" (UniqueName: \"kubernetes.io/projected/667c6c9f-b26e-4edb-b3f7-5d7241afb839-kube-api-access-zpthv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tpr99\" (UID: \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.371139 4669 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tpr99\" (UID: \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.475524 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpthv\" (UniqueName: \"kubernetes.io/projected/667c6c9f-b26e-4edb-b3f7-5d7241afb839-kube-api-access-zpthv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tpr99\" (UID: \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.475911 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tpr99\" (UID: \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.480645 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tpr99\" (UID: \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.482962 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tpr99\" (UID: 
\"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.489409 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tpr99\" (UID: \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.506888 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpthv\" (UniqueName: \"kubernetes.io/projected/667c6c9f-b26e-4edb-b3f7-5d7241afb839-kube-api-access-zpthv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tpr99\" (UID: \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:58:07 crc kubenswrapper[4669]: I1001 11:58:07.535797 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:58:08 crc kubenswrapper[4669]: I1001 11:58:08.142541 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99"] Oct 01 11:58:09 crc kubenswrapper[4669]: I1001 11:58:09.113772 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" event={"ID":"667c6c9f-b26e-4edb-b3f7-5d7241afb839","Type":"ContainerStarted","Data":"c38cac34d4e884f0cbbf213a22c12e3c547e66cf1a95564fb8109bb3ee5b93eb"} Oct 01 11:58:09 crc kubenswrapper[4669]: I1001 11:58:09.114123 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" event={"ID":"667c6c9f-b26e-4edb-b3f7-5d7241afb839","Type":"ContainerStarted","Data":"76e7a06b52502bd3425fb424c6bc48e87b5a58d1385f38c71dd3dd905799c628"} Oct 01 11:58:09 crc kubenswrapper[4669]: I1001 11:58:09.137896 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" podStartSLOduration=1.645851285 podStartE2EDuration="2.137872646s" podCreationTimestamp="2025-10-01 11:58:07 +0000 UTC" firstStartedPulling="2025-10-01 11:58:08.153474106 +0000 UTC m=+1779.253039123" lastFinishedPulling="2025-10-01 11:58:08.645495497 +0000 UTC m=+1779.745060484" observedRunningTime="2025-10-01 11:58:09.131821135 +0000 UTC m=+1780.231386122" watchObservedRunningTime="2025-10-01 11:58:09.137872646 +0000 UTC m=+1780.237437623" Oct 01 11:58:13 crc kubenswrapper[4669]: I1001 11:58:13.062682 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-csn4t"] Oct 01 11:58:13 crc kubenswrapper[4669]: I1001 11:58:13.073565 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-csn4t"] Oct 01 11:58:13 crc kubenswrapper[4669]: I1001 
11:58:13.662364 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0" path="/var/lib/kubelet/pods/9f73cdc5-5e4b-4242-a588-2b8b4aa2f1b0/volumes" Oct 01 11:58:15 crc kubenswrapper[4669]: I1001 11:58:15.645095 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:58:15 crc kubenswrapper[4669]: E1001 11:58:15.646395 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:58:16 crc kubenswrapper[4669]: I1001 11:58:16.043333 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c7d7q"] Oct 01 11:58:16 crc kubenswrapper[4669]: I1001 11:58:16.052605 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c7d7q"] Oct 01 11:58:17 crc kubenswrapper[4669]: I1001 11:58:17.663546 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="688eb6a7-b463-4b36-9ef7-a365cbabac1f" path="/var/lib/kubelet/pods/688eb6a7-b463-4b36-9ef7-a365cbabac1f/volumes" Oct 01 11:58:26 crc kubenswrapper[4669]: I1001 11:58:26.644461 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:58:26 crc kubenswrapper[4669]: E1001 11:58:26.645439 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:58:35 crc kubenswrapper[4669]: I1001 11:58:35.307489 4669 scope.go:117] "RemoveContainer" containerID="9c1f9528cc2e1978589e5ae1df339e0af1628a5d9c6e8153c4b394438d736741" Oct 01 11:58:35 crc kubenswrapper[4669]: I1001 11:58:35.391448 4669 scope.go:117] "RemoveContainer" containerID="1824df4216d73949ef552fe39392a129a7190f6916b89dfbc645ae424399fdb1" Oct 01 11:58:35 crc kubenswrapper[4669]: I1001 11:58:35.470883 4669 scope.go:117] "RemoveContainer" containerID="3a19d270666002f23c4ec0dec5dac48b73eb4ac725d66a0ed47deb1b8a32f2fa" Oct 01 11:58:37 crc kubenswrapper[4669]: I1001 11:58:37.644894 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:58:37 crc kubenswrapper[4669]: E1001 11:58:37.645485 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:58:51 crc kubenswrapper[4669]: I1001 11:58:51.645581 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:58:51 crc kubenswrapper[4669]: E1001 11:58:51.647018 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:58:58 crc kubenswrapper[4669]: I1001 11:58:58.044862 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hdx2s"] Oct 01 11:58:58 crc kubenswrapper[4669]: I1001 11:58:58.055674 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hdx2s"] Oct 01 11:58:59 crc kubenswrapper[4669]: I1001 11:58:59.681965 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="694af0ac-d829-4caa-8350-1861400d0438" path="/var/lib/kubelet/pods/694af0ac-d829-4caa-8350-1861400d0438/volumes" Oct 01 11:59:02 crc kubenswrapper[4669]: I1001 11:59:02.645969 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:59:02 crc kubenswrapper[4669]: E1001 11:59:02.647117 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:59:09 crc kubenswrapper[4669]: I1001 11:59:09.889021 4669 generic.go:334] "Generic (PLEG): container finished" podID="667c6c9f-b26e-4edb-b3f7-5d7241afb839" containerID="c38cac34d4e884f0cbbf213a22c12e3c547e66cf1a95564fb8109bb3ee5b93eb" exitCode=2 Oct 01 11:59:09 crc kubenswrapper[4669]: I1001 11:59:09.889121 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" event={"ID":"667c6c9f-b26e-4edb-b3f7-5d7241afb839","Type":"ContainerDied","Data":"c38cac34d4e884f0cbbf213a22c12e3c547e66cf1a95564fb8109bb3ee5b93eb"} Oct 01 11:59:11 crc 
kubenswrapper[4669]: I1001 11:59:11.469298 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.636505 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpthv\" (UniqueName: \"kubernetes.io/projected/667c6c9f-b26e-4edb-b3f7-5d7241afb839-kube-api-access-zpthv\") pod \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\" (UID: \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.636600 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-inventory\") pod \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\" (UID: \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.636794 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-ssh-key\") pod \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\" (UID: \"667c6c9f-b26e-4edb-b3f7-5d7241afb839\") " Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.646889 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/667c6c9f-b26e-4edb-b3f7-5d7241afb839-kube-api-access-zpthv" (OuterVolumeSpecName: "kube-api-access-zpthv") pod "667c6c9f-b26e-4edb-b3f7-5d7241afb839" (UID: "667c6c9f-b26e-4edb-b3f7-5d7241afb839"). InnerVolumeSpecName "kube-api-access-zpthv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.689313 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-inventory" (OuterVolumeSpecName: "inventory") pod "667c6c9f-b26e-4edb-b3f7-5d7241afb839" (UID: "667c6c9f-b26e-4edb-b3f7-5d7241afb839"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.691565 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "667c6c9f-b26e-4edb-b3f7-5d7241afb839" (UID: "667c6c9f-b26e-4edb-b3f7-5d7241afb839"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.740165 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpthv\" (UniqueName: \"kubernetes.io/projected/667c6c9f-b26e-4edb-b3f7-5d7241afb839-kube-api-access-zpthv\") on node \"crc\" DevicePath \"\"" Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.740947 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.741115 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/667c6c9f-b26e-4edb-b3f7-5d7241afb839-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.910587 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" 
event={"ID":"667c6c9f-b26e-4edb-b3f7-5d7241afb839","Type":"ContainerDied","Data":"76e7a06b52502bd3425fb424c6bc48e87b5a58d1385f38c71dd3dd905799c628"} Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.910892 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76e7a06b52502bd3425fb424c6bc48e87b5a58d1385f38c71dd3dd905799c628" Oct 01 11:59:11 crc kubenswrapper[4669]: I1001 11:59:11.910640 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tpr99" Oct 01 11:59:15 crc kubenswrapper[4669]: I1001 11:59:15.645392 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:59:15 crc kubenswrapper[4669]: E1001 11:59:15.646483 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.037393 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4"] Oct 01 11:59:18 crc kubenswrapper[4669]: E1001 11:59:18.038250 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667c6c9f-b26e-4edb-b3f7-5d7241afb839" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.038269 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="667c6c9f-b26e-4edb-b3f7-5d7241afb839" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.038505 4669 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="667c6c9f-b26e-4edb-b3f7-5d7241afb839" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.039357 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.046426 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.046740 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.047416 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.047824 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.049914 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4"] Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.188853 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m55t4\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.188972 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7mjl\" (UniqueName: \"kubernetes.io/projected/bee90766-2c6f-4f88-a17d-33098d6599a9-kube-api-access-p7mjl\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m55t4\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.189041 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m55t4\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.290967 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m55t4\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.291059 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7mjl\" (UniqueName: \"kubernetes.io/projected/bee90766-2c6f-4f88-a17d-33098d6599a9-kube-api-access-p7mjl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m55t4\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.291207 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m55t4\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 
11:59:18.298944 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m55t4\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.301213 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m55t4\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.318659 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7mjl\" (UniqueName: \"kubernetes.io/projected/bee90766-2c6f-4f88-a17d-33098d6599a9-kube-api-access-p7mjl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m55t4\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.408700 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.963161 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4"] Oct 01 11:59:18 crc kubenswrapper[4669]: I1001 11:59:18.995811 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" event={"ID":"bee90766-2c6f-4f88-a17d-33098d6599a9","Type":"ContainerStarted","Data":"78dc8ce1e444835d394341b9fd0fd20319ac9c3c3e1d61ba26b305e47f0acd2d"} Oct 01 11:59:20 crc kubenswrapper[4669]: I1001 11:59:20.014959 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" event={"ID":"bee90766-2c6f-4f88-a17d-33098d6599a9","Type":"ContainerStarted","Data":"3da958d66115b7ace9d83211899f3fa5497acaa55579f04403d14b7b41355498"} Oct 01 11:59:20 crc kubenswrapper[4669]: I1001 11:59:20.036871 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" podStartSLOduration=1.385099927 podStartE2EDuration="2.036844681s" podCreationTimestamp="2025-10-01 11:59:18 +0000 UTC" firstStartedPulling="2025-10-01 11:59:18.965132924 +0000 UTC m=+1850.064697911" lastFinishedPulling="2025-10-01 11:59:19.616877648 +0000 UTC m=+1850.716442665" observedRunningTime="2025-10-01 11:59:20.031026785 +0000 UTC m=+1851.130591762" watchObservedRunningTime="2025-10-01 11:59:20.036844681 +0000 UTC m=+1851.136409658" Oct 01 11:59:26 crc kubenswrapper[4669]: I1001 11:59:26.644109 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:59:26 crc kubenswrapper[4669]: E1001 11:59:26.644846 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:59:35 crc kubenswrapper[4669]: I1001 11:59:35.596628 4669 scope.go:117] "RemoveContainer" containerID="c764c22f963b6d085d64e6708f92d8490dcedce3a054b33c5e988108bff0293b" Oct 01 11:59:37 crc kubenswrapper[4669]: I1001 11:59:37.644750 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:59:37 crc kubenswrapper[4669]: E1001 11:59:37.645568 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 11:59:48 crc kubenswrapper[4669]: I1001 11:59:48.645558 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 11:59:48 crc kubenswrapper[4669]: E1001 11:59:48.646848 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.169654 4669 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784"] Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.173774 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.177047 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.178415 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.181044 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784"] Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.246227 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51feb999-df70-4811-a60f-ae7968fbd9d1-config-volume\") pod \"collect-profiles-29322000-jw784\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.246335 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl27r\" (UniqueName: \"kubernetes.io/projected/51feb999-df70-4811-a60f-ae7968fbd9d1-kube-api-access-sl27r\") pod \"collect-profiles-29322000-jw784\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.246497 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/51feb999-df70-4811-a60f-ae7968fbd9d1-secret-volume\") pod \"collect-profiles-29322000-jw784\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.349279 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51feb999-df70-4811-a60f-ae7968fbd9d1-secret-volume\") pod \"collect-profiles-29322000-jw784\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.349463 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51feb999-df70-4811-a60f-ae7968fbd9d1-config-volume\") pod \"collect-profiles-29322000-jw784\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.349610 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl27r\" (UniqueName: \"kubernetes.io/projected/51feb999-df70-4811-a60f-ae7968fbd9d1-kube-api-access-sl27r\") pod \"collect-profiles-29322000-jw784\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.351373 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51feb999-df70-4811-a60f-ae7968fbd9d1-config-volume\") pod \"collect-profiles-29322000-jw784\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.360885 4669 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51feb999-df70-4811-a60f-ae7968fbd9d1-secret-volume\") pod \"collect-profiles-29322000-jw784\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.372164 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl27r\" (UniqueName: \"kubernetes.io/projected/51feb999-df70-4811-a60f-ae7968fbd9d1-kube-api-access-sl27r\") pod \"collect-profiles-29322000-jw784\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:00 crc kubenswrapper[4669]: I1001 12:00:00.514735 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:01 crc kubenswrapper[4669]: I1001 12:00:01.129870 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784"] Oct 01 12:00:01 crc kubenswrapper[4669]: I1001 12:00:01.443647 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" event={"ID":"51feb999-df70-4811-a60f-ae7968fbd9d1","Type":"ContainerStarted","Data":"d226940d390160316228fc7ac772f82db34dd49183ee5399d0a5fcea580d8652"} Oct 01 12:00:01 crc kubenswrapper[4669]: I1001 12:00:01.443956 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" event={"ID":"51feb999-df70-4811-a60f-ae7968fbd9d1","Type":"ContainerStarted","Data":"338298c2e8a884c66e3925607798e6272319147d6d7afafd706f3c44d35d6a74"} Oct 01 12:00:01 crc kubenswrapper[4669]: I1001 12:00:01.472351 4669 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" podStartSLOduration=1.47231568 podStartE2EDuration="1.47231568s" podCreationTimestamp="2025-10-01 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:00:01.463851941 +0000 UTC m=+1892.563416918" watchObservedRunningTime="2025-10-01 12:00:01.47231568 +0000 UTC m=+1892.571880657" Oct 01 12:00:01 crc kubenswrapper[4669]: I1001 12:00:01.645547 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 12:00:01 crc kubenswrapper[4669]: E1001 12:00:01.645759 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:00:02 crc kubenswrapper[4669]: I1001 12:00:02.459108 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" event={"ID":"51feb999-df70-4811-a60f-ae7968fbd9d1","Type":"ContainerDied","Data":"d226940d390160316228fc7ac772f82db34dd49183ee5399d0a5fcea580d8652"} Oct 01 12:00:02 crc kubenswrapper[4669]: I1001 12:00:02.458837 4669 generic.go:334] "Generic (PLEG): container finished" podID="51feb999-df70-4811-a60f-ae7968fbd9d1" containerID="d226940d390160316228fc7ac772f82db34dd49183ee5399d0a5fcea580d8652" exitCode=0 Oct 01 12:00:03 crc kubenswrapper[4669]: I1001 12:00:03.926681 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.031745 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl27r\" (UniqueName: \"kubernetes.io/projected/51feb999-df70-4811-a60f-ae7968fbd9d1-kube-api-access-sl27r\") pod \"51feb999-df70-4811-a60f-ae7968fbd9d1\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.031939 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51feb999-df70-4811-a60f-ae7968fbd9d1-secret-volume\") pod \"51feb999-df70-4811-a60f-ae7968fbd9d1\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.032232 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51feb999-df70-4811-a60f-ae7968fbd9d1-config-volume\") pod \"51feb999-df70-4811-a60f-ae7968fbd9d1\" (UID: \"51feb999-df70-4811-a60f-ae7968fbd9d1\") " Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.033128 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51feb999-df70-4811-a60f-ae7968fbd9d1-config-volume" (OuterVolumeSpecName: "config-volume") pod "51feb999-df70-4811-a60f-ae7968fbd9d1" (UID: "51feb999-df70-4811-a60f-ae7968fbd9d1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.040346 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51feb999-df70-4811-a60f-ae7968fbd9d1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "51feb999-df70-4811-a60f-ae7968fbd9d1" (UID: "51feb999-df70-4811-a60f-ae7968fbd9d1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.040912 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51feb999-df70-4811-a60f-ae7968fbd9d1-kube-api-access-sl27r" (OuterVolumeSpecName: "kube-api-access-sl27r") pod "51feb999-df70-4811-a60f-ae7968fbd9d1" (UID: "51feb999-df70-4811-a60f-ae7968fbd9d1"). InnerVolumeSpecName "kube-api-access-sl27r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.135136 4669 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51feb999-df70-4811-a60f-ae7968fbd9d1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.135202 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51feb999-df70-4811-a60f-ae7968fbd9d1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.135231 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl27r\" (UniqueName: \"kubernetes.io/projected/51feb999-df70-4811-a60f-ae7968fbd9d1-kube-api-access-sl27r\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.485679 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" event={"ID":"51feb999-df70-4811-a60f-ae7968fbd9d1","Type":"ContainerDied","Data":"338298c2e8a884c66e3925607798e6272319147d6d7afafd706f3c44d35d6a74"} Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.485748 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="338298c2e8a884c66e3925607798e6272319147d6d7afafd706f3c44d35d6a74" Oct 01 12:00:04 crc kubenswrapper[4669]: I1001 12:00:04.485846 4669 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784" Oct 01 12:00:12 crc kubenswrapper[4669]: E1001 12:00:12.800162 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbee90766_2c6f_4f88_a17d_33098d6599a9.slice/crio-conmon-3da958d66115b7ace9d83211899f3fa5497acaa55579f04403d14b7b41355498.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbee90766_2c6f_4f88_a17d_33098d6599a9.slice/crio-3da958d66115b7ace9d83211899f3fa5497acaa55579f04403d14b7b41355498.scope\": RecentStats: unable to find data in memory cache]" Oct 01 12:00:13 crc kubenswrapper[4669]: I1001 12:00:13.580829 4669 generic.go:334] "Generic (PLEG): container finished" podID="bee90766-2c6f-4f88-a17d-33098d6599a9" containerID="3da958d66115b7ace9d83211899f3fa5497acaa55579f04403d14b7b41355498" exitCode=0 Oct 01 12:00:13 crc kubenswrapper[4669]: I1001 12:00:13.580900 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" event={"ID":"bee90766-2c6f-4f88-a17d-33098d6599a9","Type":"ContainerDied","Data":"3da958d66115b7ace9d83211899f3fa5497acaa55579f04403d14b7b41355498"} Oct 01 12:00:14 crc kubenswrapper[4669]: I1001 12:00:14.644408 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 12:00:14 crc kubenswrapper[4669]: E1001 12:00:14.644961 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" 
podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.146811 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.216685 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7mjl\" (UniqueName: \"kubernetes.io/projected/bee90766-2c6f-4f88-a17d-33098d6599a9-kube-api-access-p7mjl\") pod \"bee90766-2c6f-4f88-a17d-33098d6599a9\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.216831 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-inventory\") pod \"bee90766-2c6f-4f88-a17d-33098d6599a9\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.216976 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-ssh-key\") pod \"bee90766-2c6f-4f88-a17d-33098d6599a9\" (UID: \"bee90766-2c6f-4f88-a17d-33098d6599a9\") " Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.224481 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee90766-2c6f-4f88-a17d-33098d6599a9-kube-api-access-p7mjl" (OuterVolumeSpecName: "kube-api-access-p7mjl") pod "bee90766-2c6f-4f88-a17d-33098d6599a9" (UID: "bee90766-2c6f-4f88-a17d-33098d6599a9"). InnerVolumeSpecName "kube-api-access-p7mjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.255732 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-inventory" (OuterVolumeSpecName: "inventory") pod "bee90766-2c6f-4f88-a17d-33098d6599a9" (UID: "bee90766-2c6f-4f88-a17d-33098d6599a9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.263492 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bee90766-2c6f-4f88-a17d-33098d6599a9" (UID: "bee90766-2c6f-4f88-a17d-33098d6599a9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.319623 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.319683 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7mjl\" (UniqueName: \"kubernetes.io/projected/bee90766-2c6f-4f88-a17d-33098d6599a9-kube-api-access-p7mjl\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.319710 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bee90766-2c6f-4f88-a17d-33098d6599a9-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.607329 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" 
event={"ID":"bee90766-2c6f-4f88-a17d-33098d6599a9","Type":"ContainerDied","Data":"78dc8ce1e444835d394341b9fd0fd20319ac9c3c3e1d61ba26b305e47f0acd2d"} Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.607372 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78dc8ce1e444835d394341b9fd0fd20319ac9c3c3e1d61ba26b305e47f0acd2d" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.607416 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m55t4" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.734376 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-l66p2"] Oct 01 12:00:15 crc kubenswrapper[4669]: E1001 12:00:15.734878 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51feb999-df70-4811-a60f-ae7968fbd9d1" containerName="collect-profiles" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.734892 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="51feb999-df70-4811-a60f-ae7968fbd9d1" containerName="collect-profiles" Oct 01 12:00:15 crc kubenswrapper[4669]: E1001 12:00:15.734906 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee90766-2c6f-4f88-a17d-33098d6599a9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.734914 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee90766-2c6f-4f88-a17d-33098d6599a9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.735136 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee90766-2c6f-4f88-a17d-33098d6599a9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.735157 4669 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="51feb999-df70-4811-a60f-ae7968fbd9d1" containerName="collect-profiles" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.735838 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.738870 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.739528 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.739855 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.740160 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.744328 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-l66p2"] Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.833325 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdvgh\" (UniqueName: \"kubernetes.io/projected/7c88952b-368f-4527-8916-b4877e5af1e3-kube-api-access-jdvgh\") pod \"ssh-known-hosts-edpm-deployment-l66p2\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.833695 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-l66p2\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.833881 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-l66p2\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.936338 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdvgh\" (UniqueName: \"kubernetes.io/projected/7c88952b-368f-4527-8916-b4877e5af1e3-kube-api-access-jdvgh\") pod \"ssh-known-hosts-edpm-deployment-l66p2\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.936512 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-l66p2\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.936552 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-l66p2\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.952240 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-inventory-0\") pod 
\"ssh-known-hosts-edpm-deployment-l66p2\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.952311 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-l66p2\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:15 crc kubenswrapper[4669]: I1001 12:00:15.969953 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdvgh\" (UniqueName: \"kubernetes.io/projected/7c88952b-368f-4527-8916-b4877e5af1e3-kube-api-access-jdvgh\") pod \"ssh-known-hosts-edpm-deployment-l66p2\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:16 crc kubenswrapper[4669]: I1001 12:00:16.074493 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:16 crc kubenswrapper[4669]: I1001 12:00:16.637553 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-l66p2"] Oct 01 12:00:17 crc kubenswrapper[4669]: I1001 12:00:17.630747 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" event={"ID":"7c88952b-368f-4527-8916-b4877e5af1e3","Type":"ContainerStarted","Data":"7a43ba79de61455b1cba9e27b8c059f53f9f1045d17df7961f2a49cd7514fcf9"} Oct 01 12:00:17 crc kubenswrapper[4669]: I1001 12:00:17.634595 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" event={"ID":"7c88952b-368f-4527-8916-b4877e5af1e3","Type":"ContainerStarted","Data":"ab26f0f0bb0fe691b4db47df8df5d953923f679e1424e2c761d696b1d36ea64f"} Oct 01 12:00:17 crc kubenswrapper[4669]: I1001 12:00:17.651434 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" podStartSLOduration=2.192543272 podStartE2EDuration="2.65141758s" podCreationTimestamp="2025-10-01 12:00:15 +0000 UTC" firstStartedPulling="2025-10-01 12:00:16.639375707 +0000 UTC m=+1907.738940714" lastFinishedPulling="2025-10-01 12:00:17.098250045 +0000 UTC m=+1908.197815022" observedRunningTime="2025-10-01 12:00:17.651330038 +0000 UTC m=+1908.750895015" watchObservedRunningTime="2025-10-01 12:00:17.65141758 +0000 UTC m=+1908.750982557" Oct 01 12:00:25 crc kubenswrapper[4669]: I1001 12:00:25.645026 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 12:00:25 crc kubenswrapper[4669]: E1001 12:00:25.646347 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:00:25 crc kubenswrapper[4669]: I1001 12:00:25.746567 4669 generic.go:334] "Generic (PLEG): container finished" podID="7c88952b-368f-4527-8916-b4877e5af1e3" containerID="7a43ba79de61455b1cba9e27b8c059f53f9f1045d17df7961f2a49cd7514fcf9" exitCode=0 Oct 01 12:00:25 crc kubenswrapper[4669]: I1001 12:00:25.746618 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" event={"ID":"7c88952b-368f-4527-8916-b4877e5af1e3","Type":"ContainerDied","Data":"7a43ba79de61455b1cba9e27b8c059f53f9f1045d17df7961f2a49cd7514fcf9"} Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.183014 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.284884 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-ssh-key-openstack-edpm-ipam\") pod \"7c88952b-368f-4527-8916-b4877e5af1e3\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.285006 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-inventory-0\") pod \"7c88952b-368f-4527-8916-b4877e5af1e3\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.285141 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdvgh\" (UniqueName: \"kubernetes.io/projected/7c88952b-368f-4527-8916-b4877e5af1e3-kube-api-access-jdvgh\") pod 
\"7c88952b-368f-4527-8916-b4877e5af1e3\" (UID: \"7c88952b-368f-4527-8916-b4877e5af1e3\") " Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.292284 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c88952b-368f-4527-8916-b4877e5af1e3-kube-api-access-jdvgh" (OuterVolumeSpecName: "kube-api-access-jdvgh") pod "7c88952b-368f-4527-8916-b4877e5af1e3" (UID: "7c88952b-368f-4527-8916-b4877e5af1e3"). InnerVolumeSpecName "kube-api-access-jdvgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.317806 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "7c88952b-368f-4527-8916-b4877e5af1e3" (UID: "7c88952b-368f-4527-8916-b4877e5af1e3"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.320420 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7c88952b-368f-4527-8916-b4877e5af1e3" (UID: "7c88952b-368f-4527-8916-b4877e5af1e3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.388254 4669 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.388298 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdvgh\" (UniqueName: \"kubernetes.io/projected/7c88952b-368f-4527-8916-b4877e5af1e3-kube-api-access-jdvgh\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.388319 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c88952b-368f-4527-8916-b4877e5af1e3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.769192 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" event={"ID":"7c88952b-368f-4527-8916-b4877e5af1e3","Type":"ContainerDied","Data":"ab26f0f0bb0fe691b4db47df8df5d953923f679e1424e2c761d696b1d36ea64f"} Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.769273 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab26f0f0bb0fe691b4db47df8df5d953923f679e1424e2c761d696b1d36ea64f" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.769738 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-l66p2" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.869512 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj"] Oct 01 12:00:27 crc kubenswrapper[4669]: E1001 12:00:27.870667 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c88952b-368f-4527-8916-b4877e5af1e3" containerName="ssh-known-hosts-edpm-deployment" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.870685 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c88952b-368f-4527-8916-b4877e5af1e3" containerName="ssh-known-hosts-edpm-deployment" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.871105 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c88952b-368f-4527-8916-b4877e5af1e3" containerName="ssh-known-hosts-edpm-deployment" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.872308 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.881888 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.882481 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.882626 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.881881 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:00:27 crc kubenswrapper[4669]: I1001 12:00:27.886121 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj"] Oct 01 12:00:28 crc kubenswrapper[4669]: I1001 12:00:28.002345 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4phj\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:28 crc kubenswrapper[4669]: I1001 12:00:28.002424 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4phj\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:28 crc kubenswrapper[4669]: I1001 12:00:28.002503 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r574m\" (UniqueName: \"kubernetes.io/projected/0ffd3326-9422-4f07-b3e1-857324cff3e2-kube-api-access-r574m\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4phj\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:28 crc kubenswrapper[4669]: I1001 12:00:28.105962 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4phj\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:28 crc kubenswrapper[4669]: I1001 12:00:28.106042 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4phj\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:28 crc kubenswrapper[4669]: I1001 12:00:28.106133 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r574m\" (UniqueName: \"kubernetes.io/projected/0ffd3326-9422-4f07-b3e1-857324cff3e2-kube-api-access-r574m\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4phj\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:28 crc kubenswrapper[4669]: I1001 12:00:28.111694 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4phj\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:28 crc kubenswrapper[4669]: I1001 12:00:28.111925 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4phj\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:28 crc kubenswrapper[4669]: I1001 12:00:28.124330 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r574m\" (UniqueName: \"kubernetes.io/projected/0ffd3326-9422-4f07-b3e1-857324cff3e2-kube-api-access-r574m\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c4phj\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:28 crc kubenswrapper[4669]: I1001 12:00:28.208052 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:28 crc kubenswrapper[4669]: I1001 12:00:28.794959 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj"] Oct 01 12:00:29 crc kubenswrapper[4669]: I1001 12:00:29.787961 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" event={"ID":"0ffd3326-9422-4f07-b3e1-857324cff3e2","Type":"ContainerStarted","Data":"7e2fdb58e3e358862082ca0c5d519af5bbef3087f2104d2413de4f8f8194bc7e"} Oct 01 12:00:29 crc kubenswrapper[4669]: I1001 12:00:29.788564 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" event={"ID":"0ffd3326-9422-4f07-b3e1-857324cff3e2","Type":"ContainerStarted","Data":"3f5a7eaef8ad482d60f8538ca3700f8181b30dfd8f606c3164a627fdaabd801f"} Oct 01 12:00:29 crc kubenswrapper[4669]: I1001 12:00:29.804957 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" podStartSLOduration=2.349584632 podStartE2EDuration="2.804932533s" podCreationTimestamp="2025-10-01 12:00:27 +0000 UTC" firstStartedPulling="2025-10-01 12:00:28.800719813 +0000 UTC m=+1919.900284790" lastFinishedPulling="2025-10-01 12:00:29.256067674 +0000 UTC m=+1920.355632691" observedRunningTime="2025-10-01 12:00:29.803445447 +0000 UTC m=+1920.903010434" watchObservedRunningTime="2025-10-01 12:00:29.804932533 +0000 UTC m=+1920.904497510" Oct 01 12:00:35 crc kubenswrapper[4669]: I1001 12:00:35.728332 4669 scope.go:117] "RemoveContainer" containerID="a2f60114ec41fdb01b3357e0512a02a1c2c9e385743036f267c90540359914d3" Oct 01 12:00:35 crc kubenswrapper[4669]: I1001 12:00:35.746966 4669 scope.go:117] "RemoveContainer" containerID="01b721d8d539721d82aa0bd6b17be4377c69f05b958b72193e21244186b513bd" Oct 01 12:00:35 crc kubenswrapper[4669]: I1001 
12:00:35.762638 4669 scope.go:117] "RemoveContainer" containerID="e271d0df5ec2a80af27a8ecea54d84bffe108633b9b138badc58c2b988af0f10" Oct 01 12:00:38 crc kubenswrapper[4669]: I1001 12:00:38.896020 4669 generic.go:334] "Generic (PLEG): container finished" podID="0ffd3326-9422-4f07-b3e1-857324cff3e2" containerID="7e2fdb58e3e358862082ca0c5d519af5bbef3087f2104d2413de4f8f8194bc7e" exitCode=0 Oct 01 12:00:38 crc kubenswrapper[4669]: I1001 12:00:38.896130 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" event={"ID":"0ffd3326-9422-4f07-b3e1-857324cff3e2","Type":"ContainerDied","Data":"7e2fdb58e3e358862082ca0c5d519af5bbef3087f2104d2413de4f8f8194bc7e"} Oct 01 12:00:39 crc kubenswrapper[4669]: I1001 12:00:39.653393 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 12:00:39 crc kubenswrapper[4669]: E1001 12:00:39.653657 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.435881 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.604127 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-inventory\") pod \"0ffd3326-9422-4f07-b3e1-857324cff3e2\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.604451 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-ssh-key\") pod \"0ffd3326-9422-4f07-b3e1-857324cff3e2\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.604616 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r574m\" (UniqueName: \"kubernetes.io/projected/0ffd3326-9422-4f07-b3e1-857324cff3e2-kube-api-access-r574m\") pod \"0ffd3326-9422-4f07-b3e1-857324cff3e2\" (UID: \"0ffd3326-9422-4f07-b3e1-857324cff3e2\") " Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.611458 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffd3326-9422-4f07-b3e1-857324cff3e2-kube-api-access-r574m" (OuterVolumeSpecName: "kube-api-access-r574m") pod "0ffd3326-9422-4f07-b3e1-857324cff3e2" (UID: "0ffd3326-9422-4f07-b3e1-857324cff3e2"). InnerVolumeSpecName "kube-api-access-r574m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.637306 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0ffd3326-9422-4f07-b3e1-857324cff3e2" (UID: "0ffd3326-9422-4f07-b3e1-857324cff3e2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.659429 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-inventory" (OuterVolumeSpecName: "inventory") pod "0ffd3326-9422-4f07-b3e1-857324cff3e2" (UID: "0ffd3326-9422-4f07-b3e1-857324cff3e2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.707811 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.707864 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r574m\" (UniqueName: \"kubernetes.io/projected/0ffd3326-9422-4f07-b3e1-857324cff3e2-kube-api-access-r574m\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.707879 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ffd3326-9422-4f07-b3e1-857324cff3e2-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.922160 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" event={"ID":"0ffd3326-9422-4f07-b3e1-857324cff3e2","Type":"ContainerDied","Data":"3f5a7eaef8ad482d60f8538ca3700f8181b30dfd8f606c3164a627fdaabd801f"} Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.922207 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f5a7eaef8ad482d60f8538ca3700f8181b30dfd8f606c3164a627fdaabd801f" Oct 01 12:00:40 crc kubenswrapper[4669]: I1001 12:00:40.922252 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c4phj" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.028245 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf"] Oct 01 12:00:41 crc kubenswrapper[4669]: E1001 12:00:41.029487 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffd3326-9422-4f07-b3e1-857324cff3e2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.029517 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffd3326-9422-4f07-b3e1-857324cff3e2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.030018 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffd3326-9422-4f07-b3e1-857324cff3e2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.031173 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.036937 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.036956 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.036996 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.036941 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.038656 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf"] Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.220432 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf\" (UID: \"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.220591 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf\" (UID: \"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.220911 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c945r\" (UniqueName: \"kubernetes.io/projected/266686ce-e77a-4c6f-83d3-4d417e9a819f-kube-api-access-c945r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf\" (UID: \"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.325276 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf\" (UID: \"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.326754 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c945r\" (UniqueName: \"kubernetes.io/projected/266686ce-e77a-4c6f-83d3-4d417e9a819f-kube-api-access-c945r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf\" (UID: \"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.327267 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf\" (UID: \"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.333944 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf\" (UID: 
\"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.340903 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf\" (UID: \"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.355936 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c945r\" (UniqueName: \"kubernetes.io/projected/266686ce-e77a-4c6f-83d3-4d417e9a819f-kube-api-access-c945r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf\" (UID: \"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.361061 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:41 crc kubenswrapper[4669]: I1001 12:00:41.973862 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf"] Oct 01 12:00:42 crc kubenswrapper[4669]: I1001 12:00:42.948678 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" event={"ID":"266686ce-e77a-4c6f-83d3-4d417e9a819f","Type":"ContainerStarted","Data":"41da6697d914587f5b44a659f46a49690e87a4b90e6fd4e063ce651ae66e475b"} Oct 01 12:00:42 crc kubenswrapper[4669]: I1001 12:00:42.949561 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" event={"ID":"266686ce-e77a-4c6f-83d3-4d417e9a819f","Type":"ContainerStarted","Data":"e22e7674bd5be7aea57bb030af62a0582dec75a25033c361a09dd85d6cbe2e7e"} Oct 01 12:00:42 crc kubenswrapper[4669]: I1001 12:00:42.982273 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" podStartSLOduration=2.564879067 podStartE2EDuration="2.98223672s" podCreationTimestamp="2025-10-01 12:00:40 +0000 UTC" firstStartedPulling="2025-10-01 12:00:41.982762057 +0000 UTC m=+1933.082327034" lastFinishedPulling="2025-10-01 12:00:42.4001197 +0000 UTC m=+1933.499684687" observedRunningTime="2025-10-01 12:00:42.971331 +0000 UTC m=+1934.070895977" watchObservedRunningTime="2025-10-01 12:00:42.98223672 +0000 UTC m=+1934.081801737" Oct 01 12:00:54 crc kubenswrapper[4669]: I1001 12:00:54.073919 4669 generic.go:334] "Generic (PLEG): container finished" podID="266686ce-e77a-4c6f-83d3-4d417e9a819f" containerID="41da6697d914587f5b44a659f46a49690e87a4b90e6fd4e063ce651ae66e475b" exitCode=0 Oct 01 12:00:54 crc kubenswrapper[4669]: I1001 12:00:54.074144 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" event={"ID":"266686ce-e77a-4c6f-83d3-4d417e9a819f","Type":"ContainerDied","Data":"41da6697d914587f5b44a659f46a49690e87a4b90e6fd4e063ce651ae66e475b"} Oct 01 12:00:54 crc kubenswrapper[4669]: I1001 12:00:54.644293 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 12:00:54 crc kubenswrapper[4669]: E1001 12:00:54.644784 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:00:55 crc kubenswrapper[4669]: I1001 12:00:55.526375 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:55 crc kubenswrapper[4669]: I1001 12:00:55.652022 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-inventory\") pod \"266686ce-e77a-4c6f-83d3-4d417e9a819f\" (UID: \"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " Oct 01 12:00:55 crc kubenswrapper[4669]: I1001 12:00:55.652105 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c945r\" (UniqueName: \"kubernetes.io/projected/266686ce-e77a-4c6f-83d3-4d417e9a819f-kube-api-access-c945r\") pod \"266686ce-e77a-4c6f-83d3-4d417e9a819f\" (UID: \"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " Oct 01 12:00:55 crc kubenswrapper[4669]: I1001 12:00:55.652129 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-ssh-key\") pod \"266686ce-e77a-4c6f-83d3-4d417e9a819f\" (UID: \"266686ce-e77a-4c6f-83d3-4d417e9a819f\") " Oct 01 12:00:55 crc kubenswrapper[4669]: I1001 12:00:55.659781 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/266686ce-e77a-4c6f-83d3-4d417e9a819f-kube-api-access-c945r" (OuterVolumeSpecName: "kube-api-access-c945r") pod "266686ce-e77a-4c6f-83d3-4d417e9a819f" (UID: "266686ce-e77a-4c6f-83d3-4d417e9a819f"). InnerVolumeSpecName "kube-api-access-c945r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:00:55 crc kubenswrapper[4669]: I1001 12:00:55.683692 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "266686ce-e77a-4c6f-83d3-4d417e9a819f" (UID: "266686ce-e77a-4c6f-83d3-4d417e9a819f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:00:55 crc kubenswrapper[4669]: I1001 12:00:55.697126 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-inventory" (OuterVolumeSpecName: "inventory") pod "266686ce-e77a-4c6f-83d3-4d417e9a819f" (UID: "266686ce-e77a-4c6f-83d3-4d417e9a819f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:00:55 crc kubenswrapper[4669]: I1001 12:00:55.756087 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c945r\" (UniqueName: \"kubernetes.io/projected/266686ce-e77a-4c6f-83d3-4d417e9a819f-kube-api-access-c945r\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:55 crc kubenswrapper[4669]: I1001 12:00:55.756121 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:55 crc kubenswrapper[4669]: I1001 12:00:55.756130 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/266686ce-e77a-4c6f-83d3-4d417e9a819f-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.104246 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" event={"ID":"266686ce-e77a-4c6f-83d3-4d417e9a819f","Type":"ContainerDied","Data":"e22e7674bd5be7aea57bb030af62a0582dec75a25033c361a09dd85d6cbe2e7e"} Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.104605 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e22e7674bd5be7aea57bb030af62a0582dec75a25033c361a09dd85d6cbe2e7e" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.104318 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.223858 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2"] Oct 01 12:00:56 crc kubenswrapper[4669]: E1001 12:00:56.224642 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="266686ce-e77a-4c6f-83d3-4d417e9a819f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.224760 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="266686ce-e77a-4c6f-83d3-4d417e9a819f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.225110 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="266686ce-e77a-4c6f-83d3-4d417e9a819f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.231328 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.241828 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.242311 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.244031 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.244510 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.244605 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.245370 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.245656 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.245959 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.246556 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2"] Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.373827 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.373952 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.374019 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.374302 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ngp5\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-kube-api-access-5ngp5\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.374546 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.374744 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.374793 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.374940 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.375199 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.375263 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.375564 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.375752 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.375920 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.376007 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.478367 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.478491 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.478598 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.478664 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.478732 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.478775 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.478841 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: 
\"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.478891 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.478927 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.478975 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ngp5\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-kube-api-access-5ngp5\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.479050 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 
12:00:56.479187 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.479235 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.479293 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.485117 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.485278 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.490302 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.490530 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.492188 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.492980 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.493002 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.495034 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.495231 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.497006 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: 
\"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.500244 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.500490 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.501127 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 12:00:56.505859 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ngp5\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-kube-api-access-5ngp5\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:56 crc kubenswrapper[4669]: I1001 
12:00:56.571928 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:00:57 crc kubenswrapper[4669]: I1001 12:00:57.197666 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2"] Oct 01 12:00:57 crc kubenswrapper[4669]: W1001 12:00:57.206119 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb0c4afd_aaf3_4875_94ec_668841ba1127.slice/crio-a5b564a659013bcfbfa4b16fa94b890f647f0c3bc196dbc583224f4996ed32fe WatchSource:0}: Error finding container a5b564a659013bcfbfa4b16fa94b890f647f0c3bc196dbc583224f4996ed32fe: Status 404 returned error can't find the container with id a5b564a659013bcfbfa4b16fa94b890f647f0c3bc196dbc583224f4996ed32fe Oct 01 12:00:57 crc kubenswrapper[4669]: I1001 12:00:57.212768 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:00:58 crc kubenswrapper[4669]: I1001 12:00:58.142359 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" event={"ID":"bb0c4afd-aaf3-4875-94ec-668841ba1127","Type":"ContainerStarted","Data":"70f9c6ba7144881b5bcdbd2f8f27eb3924511b3ebb6c6c31dd0440d4d45a4b8a"} Oct 01 12:00:58 crc kubenswrapper[4669]: I1001 12:00:58.143718 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" event={"ID":"bb0c4afd-aaf3-4875-94ec-668841ba1127","Type":"ContainerStarted","Data":"a5b564a659013bcfbfa4b16fa94b890f647f0c3bc196dbc583224f4996ed32fe"} Oct 01 12:00:58 crc kubenswrapper[4669]: I1001 12:00:58.161828 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" podStartSLOduration=1.699565639 
podStartE2EDuration="2.161805366s" podCreationTimestamp="2025-10-01 12:00:56 +0000 UTC" firstStartedPulling="2025-10-01 12:00:57.212449357 +0000 UTC m=+1948.312014334" lastFinishedPulling="2025-10-01 12:00:57.674689084 +0000 UTC m=+1948.774254061" observedRunningTime="2025-10-01 12:00:58.158137685 +0000 UTC m=+1949.257702662" watchObservedRunningTime="2025-10-01 12:00:58.161805366 +0000 UTC m=+1949.261370353" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.135651 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29322001-ljw4f"] Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.138027 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.184839 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322001-ljw4f"] Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.279927 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-combined-ca-bundle\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.280534 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj7fz\" (UniqueName: \"kubernetes.io/projected/6de4821a-ded1-483f-ade1-dda52ecc46ed-kube-api-access-tj7fz\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.280976 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-config-data\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.281134 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-fernet-keys\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.384421 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-config-data\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.384536 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-fernet-keys\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.384646 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-combined-ca-bundle\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.384692 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj7fz\" (UniqueName: 
\"kubernetes.io/projected/6de4821a-ded1-483f-ade1-dda52ecc46ed-kube-api-access-tj7fz\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.394286 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-combined-ca-bundle\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.396255 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-fernet-keys\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.401124 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-config-data\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.409926 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj7fz\" (UniqueName: \"kubernetes.io/projected/6de4821a-ded1-483f-ade1-dda52ecc46ed-kube-api-access-tj7fz\") pod \"keystone-cron-29322001-ljw4f\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:00 crc kubenswrapper[4669]: I1001 12:01:00.495291 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:01 crc kubenswrapper[4669]: I1001 12:01:01.026986 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322001-ljw4f"] Oct 01 12:01:01 crc kubenswrapper[4669]: W1001 12:01:01.035729 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6de4821a_ded1_483f_ade1_dda52ecc46ed.slice/crio-d8948c26e531dfa77a0285531797fd6c53d4d14e7a010c130c11328c2a547da5 WatchSource:0}: Error finding container d8948c26e531dfa77a0285531797fd6c53d4d14e7a010c130c11328c2a547da5: Status 404 returned error can't find the container with id d8948c26e531dfa77a0285531797fd6c53d4d14e7a010c130c11328c2a547da5 Oct 01 12:01:01 crc kubenswrapper[4669]: I1001 12:01:01.199908 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322001-ljw4f" event={"ID":"6de4821a-ded1-483f-ade1-dda52ecc46ed","Type":"ContainerStarted","Data":"d8948c26e531dfa77a0285531797fd6c53d4d14e7a010c130c11328c2a547da5"} Oct 01 12:01:02 crc kubenswrapper[4669]: I1001 12:01:02.211326 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322001-ljw4f" event={"ID":"6de4821a-ded1-483f-ade1-dda52ecc46ed","Type":"ContainerStarted","Data":"0fdaf0927d074146dd8d12fe848125e42fde80aad0a8cfaa434020b712b4fd35"} Oct 01 12:01:02 crc kubenswrapper[4669]: I1001 12:01:02.230974 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29322001-ljw4f" podStartSLOduration=2.23092948 podStartE2EDuration="2.23092948s" podCreationTimestamp="2025-10-01 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:01:02.228223243 +0000 UTC m=+1953.327788260" watchObservedRunningTime="2025-10-01 12:01:02.23092948 +0000 UTC m=+1953.330494467" Oct 01 12:01:03 crc 
kubenswrapper[4669]: I1001 12:01:03.227521 4669 generic.go:334] "Generic (PLEG): container finished" podID="6de4821a-ded1-483f-ade1-dda52ecc46ed" containerID="0fdaf0927d074146dd8d12fe848125e42fde80aad0a8cfaa434020b712b4fd35" exitCode=0 Oct 01 12:01:03 crc kubenswrapper[4669]: I1001 12:01:03.227576 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322001-ljw4f" event={"ID":"6de4821a-ded1-483f-ade1-dda52ecc46ed","Type":"ContainerDied","Data":"0fdaf0927d074146dd8d12fe848125e42fde80aad0a8cfaa434020b712b4fd35"} Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.651150 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.717595 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj7fz\" (UniqueName: \"kubernetes.io/projected/6de4821a-ded1-483f-ade1-dda52ecc46ed-kube-api-access-tj7fz\") pod \"6de4821a-ded1-483f-ade1-dda52ecc46ed\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.717710 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-config-data\") pod \"6de4821a-ded1-483f-ade1-dda52ecc46ed\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.717942 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-combined-ca-bundle\") pod \"6de4821a-ded1-483f-ade1-dda52ecc46ed\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.718199 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-fernet-keys\") pod \"6de4821a-ded1-483f-ade1-dda52ecc46ed\" (UID: \"6de4821a-ded1-483f-ade1-dda52ecc46ed\") " Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.732852 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de4821a-ded1-483f-ade1-dda52ecc46ed-kube-api-access-tj7fz" (OuterVolumeSpecName: "kube-api-access-tj7fz") pod "6de4821a-ded1-483f-ade1-dda52ecc46ed" (UID: "6de4821a-ded1-483f-ade1-dda52ecc46ed"). InnerVolumeSpecName "kube-api-access-tj7fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.735708 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6de4821a-ded1-483f-ade1-dda52ecc46ed" (UID: "6de4821a-ded1-483f-ade1-dda52ecc46ed"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.763930 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6de4821a-ded1-483f-ade1-dda52ecc46ed" (UID: "6de4821a-ded1-483f-ade1-dda52ecc46ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.794307 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-config-data" (OuterVolumeSpecName: "config-data") pod "6de4821a-ded1-483f-ade1-dda52ecc46ed" (UID: "6de4821a-ded1-483f-ade1-dda52ecc46ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.820221 4669 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.820255 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj7fz\" (UniqueName: \"kubernetes.io/projected/6de4821a-ded1-483f-ade1-dda52ecc46ed-kube-api-access-tj7fz\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.820270 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:04 crc kubenswrapper[4669]: I1001 12:01:04.820280 4669 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6de4821a-ded1-483f-ade1-dda52ecc46ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:05 crc kubenswrapper[4669]: I1001 12:01:05.257773 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322001-ljw4f" event={"ID":"6de4821a-ded1-483f-ade1-dda52ecc46ed","Type":"ContainerDied","Data":"d8948c26e531dfa77a0285531797fd6c53d4d14e7a010c130c11328c2a547da5"} Oct 01 12:01:05 crc kubenswrapper[4669]: I1001 12:01:05.257832 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8948c26e531dfa77a0285531797fd6c53d4d14e7a010c130c11328c2a547da5" Oct 01 12:01:05 crc kubenswrapper[4669]: I1001 12:01:05.257902 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322001-ljw4f" Oct 01 12:01:05 crc kubenswrapper[4669]: I1001 12:01:05.644197 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 12:01:05 crc kubenswrapper[4669]: E1001 12:01:05.644640 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:01:20 crc kubenswrapper[4669]: I1001 12:01:20.644681 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 12:01:20 crc kubenswrapper[4669]: E1001 12:01:20.646006 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:01:32 crc kubenswrapper[4669]: I1001 12:01:32.644662 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 12:01:33 crc kubenswrapper[4669]: I1001 12:01:33.566046 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"2127b7b66421d1ad91fee0262eada6fd44bf5302e446225cc270993aced2bd3f"} Oct 01 12:01:41 crc kubenswrapper[4669]: I1001 12:01:41.659368 4669 
generic.go:334] "Generic (PLEG): container finished" podID="bb0c4afd-aaf3-4875-94ec-668841ba1127" containerID="70f9c6ba7144881b5bcdbd2f8f27eb3924511b3ebb6c6c31dd0440d4d45a4b8a" exitCode=0 Oct 01 12:01:41 crc kubenswrapper[4669]: I1001 12:01:41.659437 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" event={"ID":"bb0c4afd-aaf3-4875-94ec-668841ba1127","Type":"ContainerDied","Data":"70f9c6ba7144881b5bcdbd2f8f27eb3924511b3ebb6c6c31dd0440d4d45a4b8a"} Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.157579 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.283985 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.284362 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ngp5\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-kube-api-access-5ngp5\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.284510 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ssh-key\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.284667 4669 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ovn-combined-ca-bundle\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.284788 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.284906 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-nova-combined-ca-bundle\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.285022 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-telemetry-combined-ca-bundle\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.285124 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-neutron-metadata-combined-ca-bundle\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.285258 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-inventory\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.285389 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-repo-setup-combined-ca-bundle\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.285533 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-bootstrap-combined-ca-bundle\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.285671 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-ovn-default-certs-0\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.285784 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-libvirt-combined-ca-bundle\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.285918 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"bb0c4afd-aaf3-4875-94ec-668841ba1127\" (UID: \"bb0c4afd-aaf3-4875-94ec-668841ba1127\") " Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.292395 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.292438 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.292469 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.293206 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.293903 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.293987 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.295556 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.296710 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.297722 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.297896 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-kube-api-access-5ngp5" (OuterVolumeSpecName: "kube-api-access-5ngp5") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "kube-api-access-5ngp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.298538 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.298985 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.321546 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.327936 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-inventory" (OuterVolumeSpecName: "inventory") pod "bb0c4afd-aaf3-4875-94ec-668841ba1127" (UID: "bb0c4afd-aaf3-4875-94ec-668841ba1127"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389036 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389069 4669 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389162 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389173 4669 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389186 4669 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389198 4669 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389218 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389230 4669 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389241 4669 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389254 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389265 4669 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0c4afd-aaf3-4875-94ec-668841ba1127-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389278 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.389290 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc 
kubenswrapper[4669]: I1001 12:01:43.389302 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ngp5\" (UniqueName: \"kubernetes.io/projected/bb0c4afd-aaf3-4875-94ec-668841ba1127-kube-api-access-5ngp5\") on node \"crc\" DevicePath \"\"" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.731969 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" event={"ID":"bb0c4afd-aaf3-4875-94ec-668841ba1127","Type":"ContainerDied","Data":"a5b564a659013bcfbfa4b16fa94b890f647f0c3bc196dbc583224f4996ed32fe"} Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.732157 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5b564a659013bcfbfa4b16fa94b890f647f0c3bc196dbc583224f4996ed32fe" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.732333 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.858809 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv"] Oct 01 12:01:43 crc kubenswrapper[4669]: E1001 12:01:43.859376 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de4821a-ded1-483f-ade1-dda52ecc46ed" containerName="keystone-cron" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.859398 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de4821a-ded1-483f-ade1-dda52ecc46ed" containerName="keystone-cron" Oct 01 12:01:43 crc kubenswrapper[4669]: E1001 12:01:43.859447 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0c4afd-aaf3-4875-94ec-668841ba1127" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.859457 4669 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb0c4afd-aaf3-4875-94ec-668841ba1127" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.859690 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de4821a-ded1-483f-ade1-dda52ecc46ed" containerName="keystone-cron" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.859725 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0c4afd-aaf3-4875-94ec-668841ba1127" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.860603 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.863860 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.864344 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.867848 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.867993 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.869561 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.879869 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv"] Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.902353 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.902459 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.902577 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.902615 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw4dq\" (UniqueName: \"kubernetes.io/projected/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-kube-api-access-qw4dq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:43 crc kubenswrapper[4669]: I1001 12:01:43.902656 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.004876 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw4dq\" (UniqueName: \"kubernetes.io/projected/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-kube-api-access-qw4dq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.004984 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.005048 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.005169 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.005351 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.006382 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.009977 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.009971 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.016439 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.024914 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qw4dq\" (UniqueName: \"kubernetes.io/projected/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-kube-api-access-qw4dq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xmvtv\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.182656 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:01:44 crc kubenswrapper[4669]: I1001 12:01:44.762388 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv"] Oct 01 12:01:45 crc kubenswrapper[4669]: I1001 12:01:45.752413 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" event={"ID":"ffe0bf53-0bbb-45ac-96b3-fa31c365470c","Type":"ContainerStarted","Data":"5de5c6358ad2fc22fe93ba8b11eee513c70e38c981497d5609aad5d6347b3a52"} Oct 01 12:01:46 crc kubenswrapper[4669]: I1001 12:01:46.764417 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" event={"ID":"ffe0bf53-0bbb-45ac-96b3-fa31c365470c","Type":"ContainerStarted","Data":"ecd5b79df47af79b59ea82915ba36311abddb4f519a85b54ab8eeeed605b953f"} Oct 01 12:01:46 crc kubenswrapper[4669]: I1001 12:01:46.800782 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" podStartSLOduration=3.081906479 podStartE2EDuration="3.800759574s" podCreationTimestamp="2025-10-01 12:01:43 +0000 UTC" firstStartedPulling="2025-10-01 12:01:44.764114211 +0000 UTC m=+1995.863679188" lastFinishedPulling="2025-10-01 12:01:45.482967306 +0000 UTC m=+1996.582532283" observedRunningTime="2025-10-01 12:01:46.792244224 +0000 UTC m=+1997.891809201" watchObservedRunningTime="2025-10-01 12:01:46.800759574 +0000 
UTC m=+1997.900324551" Oct 01 12:02:57 crc kubenswrapper[4669]: I1001 12:02:57.634945 4669 generic.go:334] "Generic (PLEG): container finished" podID="ffe0bf53-0bbb-45ac-96b3-fa31c365470c" containerID="ecd5b79df47af79b59ea82915ba36311abddb4f519a85b54ab8eeeed605b953f" exitCode=0 Oct 01 12:02:57 crc kubenswrapper[4669]: I1001 12:02:57.635014 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" event={"ID":"ffe0bf53-0bbb-45ac-96b3-fa31c365470c","Type":"ContainerDied","Data":"ecd5b79df47af79b59ea82915ba36311abddb4f519a85b54ab8eeeed605b953f"} Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.119409 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.233559 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovncontroller-config-0\") pod \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.233670 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw4dq\" (UniqueName: \"kubernetes.io/projected/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-kube-api-access-qw4dq\") pod \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.233693 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovn-combined-ca-bundle\") pod \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.233749 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-inventory\") pod \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.233879 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ssh-key\") pod \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\" (UID: \"ffe0bf53-0bbb-45ac-96b3-fa31c365470c\") " Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.241000 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-kube-api-access-qw4dq" (OuterVolumeSpecName: "kube-api-access-qw4dq") pod "ffe0bf53-0bbb-45ac-96b3-fa31c365470c" (UID: "ffe0bf53-0bbb-45ac-96b3-fa31c365470c"). InnerVolumeSpecName "kube-api-access-qw4dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.242506 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ffe0bf53-0bbb-45ac-96b3-fa31c365470c" (UID: "ffe0bf53-0bbb-45ac-96b3-fa31c365470c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.275262 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-inventory" (OuterVolumeSpecName: "inventory") pod "ffe0bf53-0bbb-45ac-96b3-fa31c365470c" (UID: "ffe0bf53-0bbb-45ac-96b3-fa31c365470c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.281198 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ffe0bf53-0bbb-45ac-96b3-fa31c365470c" (UID: "ffe0bf53-0bbb-45ac-96b3-fa31c365470c"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.283867 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ffe0bf53-0bbb-45ac-96b3-fa31c365470c" (UID: "ffe0bf53-0bbb-45ac-96b3-fa31c365470c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.335851 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.335897 4669 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.335914 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw4dq\" (UniqueName: \"kubernetes.io/projected/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-kube-api-access-qw4dq\") on node \"crc\" DevicePath \"\"" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.335927 4669 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-ovn-combined-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.335940 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffe0bf53-0bbb-45ac-96b3-fa31c365470c-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.659792 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" event={"ID":"ffe0bf53-0bbb-45ac-96b3-fa31c365470c","Type":"ContainerDied","Data":"5de5c6358ad2fc22fe93ba8b11eee513c70e38c981497d5609aad5d6347b3a52"} Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.659840 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5de5c6358ad2fc22fe93ba8b11eee513c70e38c981497d5609aad5d6347b3a52" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.659867 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xmvtv" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.757753 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm"] Oct 01 12:02:59 crc kubenswrapper[4669]: E1001 12:02:59.758257 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe0bf53-0bbb-45ac-96b3-fa31c365470c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.758278 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe0bf53-0bbb-45ac-96b3-fa31c365470c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.758512 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe0bf53-0bbb-45ac-96b3-fa31c365470c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.759365 4669 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.762143 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.762202 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.763065 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.763392 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.763854 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.772094 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm"] Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.773490 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.850437 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.850549 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.850636 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.850693 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p98sn\" (UniqueName: \"kubernetes.io/projected/09c6e280-6373-44f6-ad9b-fe24fe56e738-kube-api-access-p98sn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.850760 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.850791 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.952313 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.952378 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p98sn\" (UniqueName: \"kubernetes.io/projected/09c6e280-6373-44f6-ad9b-fe24fe56e738-kube-api-access-p98sn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.952421 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.952441 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.952569 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.952626 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.957618 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.957618 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: 
\"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.957631 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.958338 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.958968 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:02:59 crc kubenswrapper[4669]: I1001 12:02:59.973187 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p98sn\" (UniqueName: \"kubernetes.io/projected/09c6e280-6373-44f6-ad9b-fe24fe56e738-kube-api-access-p98sn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:03:00 crc kubenswrapper[4669]: I1001 
12:03:00.077161 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:03:00 crc kubenswrapper[4669]: I1001 12:03:00.581269 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm"] Oct 01 12:03:00 crc kubenswrapper[4669]: W1001 12:03:00.585275 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09c6e280_6373_44f6_ad9b_fe24fe56e738.slice/crio-440ca3876b4e48d2a58c7bc884c2d7667d8777c987b2b19ffd3bc30bd57b49c2 WatchSource:0}: Error finding container 440ca3876b4e48d2a58c7bc884c2d7667d8777c987b2b19ffd3bc30bd57b49c2: Status 404 returned error can't find the container with id 440ca3876b4e48d2a58c7bc884c2d7667d8777c987b2b19ffd3bc30bd57b49c2 Oct 01 12:03:00 crc kubenswrapper[4669]: I1001 12:03:00.669395 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" event={"ID":"09c6e280-6373-44f6-ad9b-fe24fe56e738","Type":"ContainerStarted","Data":"440ca3876b4e48d2a58c7bc884c2d7667d8777c987b2b19ffd3bc30bd57b49c2"} Oct 01 12:03:01 crc kubenswrapper[4669]: I1001 12:03:01.680486 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" event={"ID":"09c6e280-6373-44f6-ad9b-fe24fe56e738","Type":"ContainerStarted","Data":"fc208c6a80da5758d28513a9f37e379438682a0128beb866a538aec9adaa6655"} Oct 01 12:03:01 crc kubenswrapper[4669]: I1001 12:03:01.705811 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" podStartSLOduration=2.167300774 podStartE2EDuration="2.705782885s" podCreationTimestamp="2025-10-01 12:02:59 +0000 UTC" firstStartedPulling="2025-10-01 12:03:00.589678777 +0000 UTC 
m=+2071.689243754" lastFinishedPulling="2025-10-01 12:03:01.128160888 +0000 UTC m=+2072.227725865" observedRunningTime="2025-10-01 12:03:01.696862824 +0000 UTC m=+2072.796427811" watchObservedRunningTime="2025-10-01 12:03:01.705782885 +0000 UTC m=+2072.805347852" Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.798932 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jgbbl"] Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.803920 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.815731 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jgbbl"] Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.832574 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hpfw\" (UniqueName: \"kubernetes.io/projected/f446e5df-424b-4437-89c4-89f782327ec5-kube-api-access-4hpfw\") pod \"certified-operators-jgbbl\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.832987 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-utilities\") pod \"certified-operators-jgbbl\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.833247 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-catalog-content\") pod \"certified-operators-jgbbl\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " 
pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.936136 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-utilities\") pod \"certified-operators-jgbbl\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.936274 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-catalog-content\") pod \"certified-operators-jgbbl\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.936340 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hpfw\" (UniqueName: \"kubernetes.io/projected/f446e5df-424b-4437-89c4-89f782327ec5-kube-api-access-4hpfw\") pod \"certified-operators-jgbbl\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.936907 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-utilities\") pod \"certified-operators-jgbbl\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.937278 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-catalog-content\") pod \"certified-operators-jgbbl\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " 
pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:29 crc kubenswrapper[4669]: I1001 12:03:29.959652 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hpfw\" (UniqueName: \"kubernetes.io/projected/f446e5df-424b-4437-89c4-89f782327ec5-kube-api-access-4hpfw\") pod \"certified-operators-jgbbl\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:30 crc kubenswrapper[4669]: I1001 12:03:30.136217 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:30 crc kubenswrapper[4669]: W1001 12:03:30.700062 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf446e5df_424b_4437_89c4_89f782327ec5.slice/crio-079d51071898500b5b010b03d2fb3e434e6c79f19807bded745d402d51a200a3 WatchSource:0}: Error finding container 079d51071898500b5b010b03d2fb3e434e6c79f19807bded745d402d51a200a3: Status 404 returned error can't find the container with id 079d51071898500b5b010b03d2fb3e434e6c79f19807bded745d402d51a200a3 Oct 01 12:03:30 crc kubenswrapper[4669]: I1001 12:03:30.701608 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jgbbl"] Oct 01 12:03:31 crc kubenswrapper[4669]: I1001 12:03:31.068859 4669 generic.go:334] "Generic (PLEG): container finished" podID="f446e5df-424b-4437-89c4-89f782327ec5" containerID="9b9ac4d9dc58217a874e3cd5862c068d260dd4428e7f398a3fa53047bca6c65d" exitCode=0 Oct 01 12:03:31 crc kubenswrapper[4669]: I1001 12:03:31.068980 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgbbl" event={"ID":"f446e5df-424b-4437-89c4-89f782327ec5","Type":"ContainerDied","Data":"9b9ac4d9dc58217a874e3cd5862c068d260dd4428e7f398a3fa53047bca6c65d"} Oct 01 12:03:31 crc kubenswrapper[4669]: I1001 12:03:31.069404 
4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgbbl" event={"ID":"f446e5df-424b-4437-89c4-89f782327ec5","Type":"ContainerStarted","Data":"079d51071898500b5b010b03d2fb3e434e6c79f19807bded745d402d51a200a3"} Oct 01 12:03:33 crc kubenswrapper[4669]: I1001 12:03:33.092577 4669 generic.go:334] "Generic (PLEG): container finished" podID="f446e5df-424b-4437-89c4-89f782327ec5" containerID="c7e80ed092d7faa9ca1b609cd77207485c509f292c57919518c5ac90f33ad165" exitCode=0 Oct 01 12:03:33 crc kubenswrapper[4669]: I1001 12:03:33.092649 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgbbl" event={"ID":"f446e5df-424b-4437-89c4-89f782327ec5","Type":"ContainerDied","Data":"c7e80ed092d7faa9ca1b609cd77207485c509f292c57919518c5ac90f33ad165"} Oct 01 12:03:34 crc kubenswrapper[4669]: I1001 12:03:34.109683 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgbbl" event={"ID":"f446e5df-424b-4437-89c4-89f782327ec5","Type":"ContainerStarted","Data":"3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875"} Oct 01 12:03:34 crc kubenswrapper[4669]: I1001 12:03:34.147188 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jgbbl" podStartSLOduration=2.743451505 podStartE2EDuration="5.147163865s" podCreationTimestamp="2025-10-01 12:03:29 +0000 UTC" firstStartedPulling="2025-10-01 12:03:31.070320719 +0000 UTC m=+2102.169885696" lastFinishedPulling="2025-10-01 12:03:33.474033069 +0000 UTC m=+2104.573598056" observedRunningTime="2025-10-01 12:03:34.139230649 +0000 UTC m=+2105.238795646" watchObservedRunningTime="2025-10-01 12:03:34.147163865 +0000 UTC m=+2105.246728852" Oct 01 12:03:40 crc kubenswrapper[4669]: I1001 12:03:40.137404 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 
12:03:40 crc kubenswrapper[4669]: I1001 12:03:40.139333 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:40 crc kubenswrapper[4669]: I1001 12:03:40.208498 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:40 crc kubenswrapper[4669]: I1001 12:03:40.279201 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:40 crc kubenswrapper[4669]: I1001 12:03:40.454608 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jgbbl"] Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.191627 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jgbbl" podUID="f446e5df-424b-4437-89c4-89f782327ec5" containerName="registry-server" containerID="cri-o://3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875" gracePeriod=2 Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.741540 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.851064 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-utilities\") pod \"f446e5df-424b-4437-89c4-89f782327ec5\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.851178 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hpfw\" (UniqueName: \"kubernetes.io/projected/f446e5df-424b-4437-89c4-89f782327ec5-kube-api-access-4hpfw\") pod \"f446e5df-424b-4437-89c4-89f782327ec5\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.851400 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-catalog-content\") pod \"f446e5df-424b-4437-89c4-89f782327ec5\" (UID: \"f446e5df-424b-4437-89c4-89f782327ec5\") " Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.853302 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-utilities" (OuterVolumeSpecName: "utilities") pod "f446e5df-424b-4437-89c4-89f782327ec5" (UID: "f446e5df-424b-4437-89c4-89f782327ec5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.861964 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bqfp5"] Oct 01 12:03:42 crc kubenswrapper[4669]: E1001 12:03:42.862662 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f446e5df-424b-4437-89c4-89f782327ec5" containerName="extract-utilities" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.862779 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f446e5df-424b-4437-89c4-89f782327ec5" containerName="extract-utilities" Oct 01 12:03:42 crc kubenswrapper[4669]: E1001 12:03:42.862883 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f446e5df-424b-4437-89c4-89f782327ec5" containerName="extract-content" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.862962 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f446e5df-424b-4437-89c4-89f782327ec5" containerName="extract-content" Oct 01 12:03:42 crc kubenswrapper[4669]: E1001 12:03:42.863047 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f446e5df-424b-4437-89c4-89f782327ec5" containerName="registry-server" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.863152 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="f446e5df-424b-4437-89c4-89f782327ec5" containerName="registry-server" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.863456 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="f446e5df-424b-4437-89c4-89f782327ec5" containerName="registry-server" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.865298 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.866971 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f446e5df-424b-4437-89c4-89f782327ec5-kube-api-access-4hpfw" (OuterVolumeSpecName: "kube-api-access-4hpfw") pod "f446e5df-424b-4437-89c4-89f782327ec5" (UID: "f446e5df-424b-4437-89c4-89f782327ec5"). InnerVolumeSpecName "kube-api-access-4hpfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.880716 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqfp5"] Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.922367 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f446e5df-424b-4437-89c4-89f782327ec5" (UID: "f446e5df-424b-4437-89c4-89f782327ec5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.954327 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-catalog-content\") pod \"redhat-marketplace-bqfp5\" (UID: \"a48df801-93a1-4aee-a59a-00af2499de59\") " pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.954722 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-utilities\") pod \"redhat-marketplace-bqfp5\" (UID: \"a48df801-93a1-4aee-a59a-00af2499de59\") " pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.954860 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9gpq\" (UniqueName: \"kubernetes.io/projected/a48df801-93a1-4aee-a59a-00af2499de59-kube-api-access-f9gpq\") pod \"redhat-marketplace-bqfp5\" (UID: \"a48df801-93a1-4aee-a59a-00af2499de59\") " pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.954974 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.955065 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f446e5df-424b-4437-89c4-89f782327ec5-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:42 crc kubenswrapper[4669]: I1001 12:03:42.955259 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hpfw\" (UniqueName: 
\"kubernetes.io/projected/f446e5df-424b-4437-89c4-89f782327ec5-kube-api-access-4hpfw\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.057281 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-utilities\") pod \"redhat-marketplace-bqfp5\" (UID: \"a48df801-93a1-4aee-a59a-00af2499de59\") " pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.057368 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9gpq\" (UniqueName: \"kubernetes.io/projected/a48df801-93a1-4aee-a59a-00af2499de59-kube-api-access-f9gpq\") pod \"redhat-marketplace-bqfp5\" (UID: \"a48df801-93a1-4aee-a59a-00af2499de59\") " pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.057414 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-catalog-content\") pod \"redhat-marketplace-bqfp5\" (UID: \"a48df801-93a1-4aee-a59a-00af2499de59\") " pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.057954 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-catalog-content\") pod \"redhat-marketplace-bqfp5\" (UID: \"a48df801-93a1-4aee-a59a-00af2499de59\") " pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.058152 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-utilities\") pod \"redhat-marketplace-bqfp5\" (UID: 
\"a48df801-93a1-4aee-a59a-00af2499de59\") " pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.085014 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9gpq\" (UniqueName: \"kubernetes.io/projected/a48df801-93a1-4aee-a59a-00af2499de59-kube-api-access-f9gpq\") pod \"redhat-marketplace-bqfp5\" (UID: \"a48df801-93a1-4aee-a59a-00af2499de59\") " pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.239758 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.248218 4669 generic.go:334] "Generic (PLEG): container finished" podID="f446e5df-424b-4437-89c4-89f782327ec5" containerID="3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875" exitCode=0 Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.248635 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgbbl" event={"ID":"f446e5df-424b-4437-89c4-89f782327ec5","Type":"ContainerDied","Data":"3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875"} Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.248677 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgbbl" event={"ID":"f446e5df-424b-4437-89c4-89f782327ec5","Type":"ContainerDied","Data":"079d51071898500b5b010b03d2fb3e434e6c79f19807bded745d402d51a200a3"} Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.248716 4669 scope.go:117] "RemoveContainer" containerID="3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.252570 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jgbbl" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.293028 4669 scope.go:117] "RemoveContainer" containerID="c7e80ed092d7faa9ca1b609cd77207485c509f292c57919518c5ac90f33ad165" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.313936 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jgbbl"] Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.322420 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jgbbl"] Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.330507 4669 scope.go:117] "RemoveContainer" containerID="9b9ac4d9dc58217a874e3cd5862c068d260dd4428e7f398a3fa53047bca6c65d" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.406684 4669 scope.go:117] "RemoveContainer" containerID="3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875" Oct 01 12:03:43 crc kubenswrapper[4669]: E1001 12:03:43.407266 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875\": container with ID starting with 3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875 not found: ID does not exist" containerID="3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.407293 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875"} err="failed to get container status \"3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875\": rpc error: code = NotFound desc = could not find container \"3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875\": container with ID starting with 3631ef5dca397a8aae255ee84b146eded548ea0c849143d27758a61c654a1875 not 
found: ID does not exist" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.407314 4669 scope.go:117] "RemoveContainer" containerID="c7e80ed092d7faa9ca1b609cd77207485c509f292c57919518c5ac90f33ad165" Oct 01 12:03:43 crc kubenswrapper[4669]: E1001 12:03:43.407529 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7e80ed092d7faa9ca1b609cd77207485c509f292c57919518c5ac90f33ad165\": container with ID starting with c7e80ed092d7faa9ca1b609cd77207485c509f292c57919518c5ac90f33ad165 not found: ID does not exist" containerID="c7e80ed092d7faa9ca1b609cd77207485c509f292c57919518c5ac90f33ad165" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.407549 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e80ed092d7faa9ca1b609cd77207485c509f292c57919518c5ac90f33ad165"} err="failed to get container status \"c7e80ed092d7faa9ca1b609cd77207485c509f292c57919518c5ac90f33ad165\": rpc error: code = NotFound desc = could not find container \"c7e80ed092d7faa9ca1b609cd77207485c509f292c57919518c5ac90f33ad165\": container with ID starting with c7e80ed092d7faa9ca1b609cd77207485c509f292c57919518c5ac90f33ad165 not found: ID does not exist" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.407562 4669 scope.go:117] "RemoveContainer" containerID="9b9ac4d9dc58217a874e3cd5862c068d260dd4428e7f398a3fa53047bca6c65d" Oct 01 12:03:43 crc kubenswrapper[4669]: E1001 12:03:43.407738 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9ac4d9dc58217a874e3cd5862c068d260dd4428e7f398a3fa53047bca6c65d\": container with ID starting with 9b9ac4d9dc58217a874e3cd5862c068d260dd4428e7f398a3fa53047bca6c65d not found: ID does not exist" containerID="9b9ac4d9dc58217a874e3cd5862c068d260dd4428e7f398a3fa53047bca6c65d" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.407754 4669 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9ac4d9dc58217a874e3cd5862c068d260dd4428e7f398a3fa53047bca6c65d"} err="failed to get container status \"9b9ac4d9dc58217a874e3cd5862c068d260dd4428e7f398a3fa53047bca6c65d\": rpc error: code = NotFound desc = could not find container \"9b9ac4d9dc58217a874e3cd5862c068d260dd4428e7f398a3fa53047bca6c65d\": container with ID starting with 9b9ac4d9dc58217a874e3cd5862c068d260dd4428e7f398a3fa53047bca6c65d not found: ID does not exist" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.664472 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f446e5df-424b-4437-89c4-89f782327ec5" path="/var/lib/kubelet/pods/f446e5df-424b-4437-89c4-89f782327ec5/volumes" Oct 01 12:03:43 crc kubenswrapper[4669]: I1001 12:03:43.757625 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqfp5"] Oct 01 12:03:43 crc kubenswrapper[4669]: W1001 12:03:43.765126 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda48df801_93a1_4aee_a59a_00af2499de59.slice/crio-d2c963ae30c3322aa5c91e77c188419591fdccd8e42ab947e0f6932db580d2d6 WatchSource:0}: Error finding container d2c963ae30c3322aa5c91e77c188419591fdccd8e42ab947e0f6932db580d2d6: Status 404 returned error can't find the container with id d2c963ae30c3322aa5c91e77c188419591fdccd8e42ab947e0f6932db580d2d6 Oct 01 12:03:44 crc kubenswrapper[4669]: I1001 12:03:44.262758 4669 generic.go:334] "Generic (PLEG): container finished" podID="a48df801-93a1-4aee-a59a-00af2499de59" containerID="d3988fe147369c7a442662a297696333f93882d738d5d959504567bcc5e63c09" exitCode=0 Oct 01 12:03:44 crc kubenswrapper[4669]: I1001 12:03:44.262837 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqfp5" 
event={"ID":"a48df801-93a1-4aee-a59a-00af2499de59","Type":"ContainerDied","Data":"d3988fe147369c7a442662a297696333f93882d738d5d959504567bcc5e63c09"} Oct 01 12:03:44 crc kubenswrapper[4669]: I1001 12:03:44.263274 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqfp5" event={"ID":"a48df801-93a1-4aee-a59a-00af2499de59","Type":"ContainerStarted","Data":"d2c963ae30c3322aa5c91e77c188419591fdccd8e42ab947e0f6932db580d2d6"} Oct 01 12:03:46 crc kubenswrapper[4669]: I1001 12:03:46.294730 4669 generic.go:334] "Generic (PLEG): container finished" podID="a48df801-93a1-4aee-a59a-00af2499de59" containerID="843fac41844aa8d1ddbbc7e30c09ffe64768b51ee7d7de129d0a166e8fb2c812" exitCode=0 Oct 01 12:03:46 crc kubenswrapper[4669]: I1001 12:03:46.294821 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqfp5" event={"ID":"a48df801-93a1-4aee-a59a-00af2499de59","Type":"ContainerDied","Data":"843fac41844aa8d1ddbbc7e30c09ffe64768b51ee7d7de129d0a166e8fb2c812"} Oct 01 12:03:47 crc kubenswrapper[4669]: I1001 12:03:47.309676 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqfp5" event={"ID":"a48df801-93a1-4aee-a59a-00af2499de59","Type":"ContainerStarted","Data":"44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd"} Oct 01 12:03:47 crc kubenswrapper[4669]: I1001 12:03:47.345396 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bqfp5" podStartSLOduration=2.789588494 podStartE2EDuration="5.345352791s" podCreationTimestamp="2025-10-01 12:03:42 +0000 UTC" firstStartedPulling="2025-10-01 12:03:44.265631863 +0000 UTC m=+2115.365196840" lastFinishedPulling="2025-10-01 12:03:46.82139615 +0000 UTC m=+2117.920961137" observedRunningTime="2025-10-01 12:03:47.334839021 +0000 UTC m=+2118.434404018" watchObservedRunningTime="2025-10-01 12:03:47.345352791 +0000 UTC 
m=+2118.444917768" Oct 01 12:03:53 crc kubenswrapper[4669]: I1001 12:03:53.241783 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:53 crc kubenswrapper[4669]: I1001 12:03:53.242519 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:53 crc kubenswrapper[4669]: I1001 12:03:53.318714 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:53 crc kubenswrapper[4669]: I1001 12:03:53.428932 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:53 crc kubenswrapper[4669]: I1001 12:03:53.564048 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqfp5"] Oct 01 12:03:55 crc kubenswrapper[4669]: I1001 12:03:55.395762 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bqfp5" podUID="a48df801-93a1-4aee-a59a-00af2499de59" containerName="registry-server" containerID="cri-o://44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd" gracePeriod=2 Oct 01 12:03:55 crc kubenswrapper[4669]: I1001 12:03:55.880509 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:55 crc kubenswrapper[4669]: I1001 12:03:55.983324 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-catalog-content\") pod \"a48df801-93a1-4aee-a59a-00af2499de59\" (UID: \"a48df801-93a1-4aee-a59a-00af2499de59\") " Oct 01 12:03:55 crc kubenswrapper[4669]: I1001 12:03:55.983415 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-utilities\") pod \"a48df801-93a1-4aee-a59a-00af2499de59\" (UID: \"a48df801-93a1-4aee-a59a-00af2499de59\") " Oct 01 12:03:55 crc kubenswrapper[4669]: I1001 12:03:55.983471 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9gpq\" (UniqueName: \"kubernetes.io/projected/a48df801-93a1-4aee-a59a-00af2499de59-kube-api-access-f9gpq\") pod \"a48df801-93a1-4aee-a59a-00af2499de59\" (UID: \"a48df801-93a1-4aee-a59a-00af2499de59\") " Oct 01 12:03:55 crc kubenswrapper[4669]: I1001 12:03:55.984438 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-utilities" (OuterVolumeSpecName: "utilities") pod "a48df801-93a1-4aee-a59a-00af2499de59" (UID: "a48df801-93a1-4aee-a59a-00af2499de59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.002095 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a48df801-93a1-4aee-a59a-00af2499de59-kube-api-access-f9gpq" (OuterVolumeSpecName: "kube-api-access-f9gpq") pod "a48df801-93a1-4aee-a59a-00af2499de59" (UID: "a48df801-93a1-4aee-a59a-00af2499de59"). InnerVolumeSpecName "kube-api-access-f9gpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.004501 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a48df801-93a1-4aee-a59a-00af2499de59" (UID: "a48df801-93a1-4aee-a59a-00af2499de59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.085376 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.085736 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9gpq\" (UniqueName: \"kubernetes.io/projected/a48df801-93a1-4aee-a59a-00af2499de59-kube-api-access-f9gpq\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.085752 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48df801-93a1-4aee-a59a-00af2499de59-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.413049 4669 generic.go:334] "Generic (PLEG): container finished" podID="09c6e280-6373-44f6-ad9b-fe24fe56e738" containerID="fc208c6a80da5758d28513a9f37e379438682a0128beb866a538aec9adaa6655" exitCode=0 Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.413298 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" event={"ID":"09c6e280-6373-44f6-ad9b-fe24fe56e738","Type":"ContainerDied","Data":"fc208c6a80da5758d28513a9f37e379438682a0128beb866a538aec9adaa6655"} Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.419239 4669 generic.go:334] 
"Generic (PLEG): container finished" podID="a48df801-93a1-4aee-a59a-00af2499de59" containerID="44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd" exitCode=0 Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.419354 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqfp5" event={"ID":"a48df801-93a1-4aee-a59a-00af2499de59","Type":"ContainerDied","Data":"44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd"} Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.419407 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqfp5" event={"ID":"a48df801-93a1-4aee-a59a-00af2499de59","Type":"ContainerDied","Data":"d2c963ae30c3322aa5c91e77c188419591fdccd8e42ab947e0f6932db580d2d6"} Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.419445 4669 scope.go:117] "RemoveContainer" containerID="44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.419814 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqfp5" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.478578 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqfp5"] Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.479492 4669 scope.go:117] "RemoveContainer" containerID="843fac41844aa8d1ddbbc7e30c09ffe64768b51ee7d7de129d0a166e8fb2c812" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.484544 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqfp5"] Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.501484 4669 scope.go:117] "RemoveContainer" containerID="d3988fe147369c7a442662a297696333f93882d738d5d959504567bcc5e63c09" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.554137 4669 scope.go:117] "RemoveContainer" containerID="44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd" Oct 01 12:03:56 crc kubenswrapper[4669]: E1001 12:03:56.554989 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd\": container with ID starting with 44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd not found: ID does not exist" containerID="44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.555032 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd"} err="failed to get container status \"44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd\": rpc error: code = NotFound desc = could not find container \"44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd\": container with ID starting with 44ad0a38b81725bec59906024313a90d2d807f063b0b97da3f70ffc26afc6ddd not found: 
ID does not exist" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.555054 4669 scope.go:117] "RemoveContainer" containerID="843fac41844aa8d1ddbbc7e30c09ffe64768b51ee7d7de129d0a166e8fb2c812" Oct 01 12:03:56 crc kubenswrapper[4669]: E1001 12:03:56.556660 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"843fac41844aa8d1ddbbc7e30c09ffe64768b51ee7d7de129d0a166e8fb2c812\": container with ID starting with 843fac41844aa8d1ddbbc7e30c09ffe64768b51ee7d7de129d0a166e8fb2c812 not found: ID does not exist" containerID="843fac41844aa8d1ddbbc7e30c09ffe64768b51ee7d7de129d0a166e8fb2c812" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.556734 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"843fac41844aa8d1ddbbc7e30c09ffe64768b51ee7d7de129d0a166e8fb2c812"} err="failed to get container status \"843fac41844aa8d1ddbbc7e30c09ffe64768b51ee7d7de129d0a166e8fb2c812\": rpc error: code = NotFound desc = could not find container \"843fac41844aa8d1ddbbc7e30c09ffe64768b51ee7d7de129d0a166e8fb2c812\": container with ID starting with 843fac41844aa8d1ddbbc7e30c09ffe64768b51ee7d7de129d0a166e8fb2c812 not found: ID does not exist" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.556778 4669 scope.go:117] "RemoveContainer" containerID="d3988fe147369c7a442662a297696333f93882d738d5d959504567bcc5e63c09" Oct 01 12:03:56 crc kubenswrapper[4669]: E1001 12:03:56.557188 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3988fe147369c7a442662a297696333f93882d738d5d959504567bcc5e63c09\": container with ID starting with d3988fe147369c7a442662a297696333f93882d738d5d959504567bcc5e63c09 not found: ID does not exist" containerID="d3988fe147369c7a442662a297696333f93882d738d5d959504567bcc5e63c09" Oct 01 12:03:56 crc kubenswrapper[4669]: I1001 12:03:56.557239 4669 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3988fe147369c7a442662a297696333f93882d738d5d959504567bcc5e63c09"} err="failed to get container status \"d3988fe147369c7a442662a297696333f93882d738d5d959504567bcc5e63c09\": rpc error: code = NotFound desc = could not find container \"d3988fe147369c7a442662a297696333f93882d738d5d959504567bcc5e63c09\": container with ID starting with d3988fe147369c7a442662a297696333f93882d738d5d959504567bcc5e63c09 not found: ID does not exist" Oct 01 12:03:57 crc kubenswrapper[4669]: I1001 12:03:57.657606 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a48df801-93a1-4aee-a59a-00af2499de59" path="/var/lib/kubelet/pods/a48df801-93a1-4aee-a59a-00af2499de59/volumes" Oct 01 12:03:57 crc kubenswrapper[4669]: I1001 12:03:57.931950 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.026000 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-ovn-metadata-agent-neutron-config-0\") pod \"09c6e280-6373-44f6-ad9b-fe24fe56e738\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.026107 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-metadata-combined-ca-bundle\") pod \"09c6e280-6373-44f6-ad9b-fe24fe56e738\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.026154 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-ssh-key\") pod \"09c6e280-6373-44f6-ad9b-fe24fe56e738\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.026245 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p98sn\" (UniqueName: \"kubernetes.io/projected/09c6e280-6373-44f6-ad9b-fe24fe56e738-kube-api-access-p98sn\") pod \"09c6e280-6373-44f6-ad9b-fe24fe56e738\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.026373 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-nova-metadata-neutron-config-0\") pod \"09c6e280-6373-44f6-ad9b-fe24fe56e738\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.028094 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-inventory\") pod \"09c6e280-6373-44f6-ad9b-fe24fe56e738\" (UID: \"09c6e280-6373-44f6-ad9b-fe24fe56e738\") " Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.033963 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "09c6e280-6373-44f6-ad9b-fe24fe56e738" (UID: "09c6e280-6373-44f6-ad9b-fe24fe56e738"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.043970 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09c6e280-6373-44f6-ad9b-fe24fe56e738-kube-api-access-p98sn" (OuterVolumeSpecName: "kube-api-access-p98sn") pod "09c6e280-6373-44f6-ad9b-fe24fe56e738" (UID: "09c6e280-6373-44f6-ad9b-fe24fe56e738"). InnerVolumeSpecName "kube-api-access-p98sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.071961 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "09c6e280-6373-44f6-ad9b-fe24fe56e738" (UID: "09c6e280-6373-44f6-ad9b-fe24fe56e738"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.074228 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "09c6e280-6373-44f6-ad9b-fe24fe56e738" (UID: "09c6e280-6373-44f6-ad9b-fe24fe56e738"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.078351 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "09c6e280-6373-44f6-ad9b-fe24fe56e738" (UID: "09c6e280-6373-44f6-ad9b-fe24fe56e738"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.078711 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-inventory" (OuterVolumeSpecName: "inventory") pod "09c6e280-6373-44f6-ad9b-fe24fe56e738" (UID: "09c6e280-6373-44f6-ad9b-fe24fe56e738"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.131310 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.131359 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p98sn\" (UniqueName: \"kubernetes.io/projected/09c6e280-6373-44f6-ad9b-fe24fe56e738-kube-api-access-p98sn\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.131384 4669 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.131417 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.131437 4669 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.131453 4669 reconciler_common.go:293] "Volume detached for 
volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c6e280-6373-44f6-ad9b-fe24fe56e738-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.448706 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" event={"ID":"09c6e280-6373-44f6-ad9b-fe24fe56e738","Type":"ContainerDied","Data":"440ca3876b4e48d2a58c7bc884c2d7667d8777c987b2b19ffd3bc30bd57b49c2"} Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.448750 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="440ca3876b4e48d2a58c7bc884c2d7667d8777c987b2b19ffd3bc30bd57b49c2" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.448849 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.585760 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc"] Oct 01 12:03:58 crc kubenswrapper[4669]: E1001 12:03:58.586200 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48df801-93a1-4aee-a59a-00af2499de59" containerName="registry-server" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.586217 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48df801-93a1-4aee-a59a-00af2499de59" containerName="registry-server" Oct 01 12:03:58 crc kubenswrapper[4669]: E1001 12:03:58.586260 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48df801-93a1-4aee-a59a-00af2499de59" containerName="extract-utilities" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.586270 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48df801-93a1-4aee-a59a-00af2499de59" containerName="extract-utilities" Oct 01 12:03:58 crc kubenswrapper[4669]: E1001 
12:03:58.586289 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09c6e280-6373-44f6-ad9b-fe24fe56e738" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.586297 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="09c6e280-6373-44f6-ad9b-fe24fe56e738" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 12:03:58 crc kubenswrapper[4669]: E1001 12:03:58.586314 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48df801-93a1-4aee-a59a-00af2499de59" containerName="extract-content" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.586320 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48df801-93a1-4aee-a59a-00af2499de59" containerName="extract-content" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.586485 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="09c6e280-6373-44f6-ad9b-fe24fe56e738" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.586498 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a48df801-93a1-4aee-a59a-00af2499de59" containerName="registry-server" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.587107 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.589580 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.589628 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.589595 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.589845 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.590213 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.603002 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc"] Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.743482 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.743555 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.743678 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.743728 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjs6h\" (UniqueName: \"kubernetes.io/projected/9f57f089-5ea5-4b92-acbb-e14488a50253-kube-api-access-gjs6h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.743794 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.848174 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjs6h\" (UniqueName: \"kubernetes.io/projected/9f57f089-5ea5-4b92-acbb-e14488a50253-kube-api-access-gjs6h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.848999 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.849046 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.849096 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.849248 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.860152 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 
12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.867254 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.870650 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.874838 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.888798 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjs6h\" (UniqueName: \"kubernetes.io/projected/9f57f089-5ea5-4b92-acbb-e14488a50253-kube-api-access-gjs6h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:58 crc kubenswrapper[4669]: I1001 12:03:58.903588 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:03:59 crc kubenswrapper[4669]: I1001 12:03:59.532662 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc"] Oct 01 12:04:00 crc kubenswrapper[4669]: I1001 12:04:00.471174 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" event={"ID":"9f57f089-5ea5-4b92-acbb-e14488a50253","Type":"ContainerStarted","Data":"69c322cb0f6d577c357bd2c7852b15bae494cdd20f91bd070fbffbeb441d12b4"} Oct 01 12:04:01 crc kubenswrapper[4669]: I1001 12:04:01.481554 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" event={"ID":"9f57f089-5ea5-4b92-acbb-e14488a50253","Type":"ContainerStarted","Data":"b7dbce73c3f9fd68dd91d7467dac47e4a535c486575dd445c75c18fc3580f823"} Oct 01 12:04:01 crc kubenswrapper[4669]: I1001 12:04:01.507574 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" podStartSLOduration=2.7722928060000003 podStartE2EDuration="3.507550536s" podCreationTimestamp="2025-10-01 12:03:58 +0000 UTC" firstStartedPulling="2025-10-01 12:03:59.550343255 +0000 UTC m=+2130.649908232" lastFinishedPulling="2025-10-01 12:04:00.285600965 +0000 UTC m=+2131.385165962" observedRunningTime="2025-10-01 12:04:01.498767459 +0000 UTC m=+2132.598332436" watchObservedRunningTime="2025-10-01 12:04:01.507550536 +0000 UTC m=+2132.607115513" Oct 01 12:04:01 crc kubenswrapper[4669]: I1001 12:04:01.863583 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:04:01 crc kubenswrapper[4669]: 
I1001 12:04:01.863674 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.675502 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tgrkc"] Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.678739 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.701148 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tgrkc"] Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.784255 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-catalog-content\") pod \"redhat-operators-tgrkc\" (UID: \"c8cae08f-22ef-429b-81b7-6c555e602a07\") " pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.784337 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-utilities\") pod \"redhat-operators-tgrkc\" (UID: \"c8cae08f-22ef-429b-81b7-6c555e602a07\") " pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.784440 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w58d5\" (UniqueName: \"kubernetes.io/projected/c8cae08f-22ef-429b-81b7-6c555e602a07-kube-api-access-w58d5\") pod 
\"redhat-operators-tgrkc\" (UID: \"c8cae08f-22ef-429b-81b7-6c555e602a07\") " pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.886328 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-catalog-content\") pod \"redhat-operators-tgrkc\" (UID: \"c8cae08f-22ef-429b-81b7-6c555e602a07\") " pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.886394 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-utilities\") pod \"redhat-operators-tgrkc\" (UID: \"c8cae08f-22ef-429b-81b7-6c555e602a07\") " pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.886466 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w58d5\" (UniqueName: \"kubernetes.io/projected/c8cae08f-22ef-429b-81b7-6c555e602a07-kube-api-access-w58d5\") pod \"redhat-operators-tgrkc\" (UID: \"c8cae08f-22ef-429b-81b7-6c555e602a07\") " pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.887012 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-catalog-content\") pod \"redhat-operators-tgrkc\" (UID: \"c8cae08f-22ef-429b-81b7-6c555e602a07\") " pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.887258 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-utilities\") pod \"redhat-operators-tgrkc\" (UID: 
\"c8cae08f-22ef-429b-81b7-6c555e602a07\") " pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:08 crc kubenswrapper[4669]: I1001 12:04:08.913234 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w58d5\" (UniqueName: \"kubernetes.io/projected/c8cae08f-22ef-429b-81b7-6c555e602a07-kube-api-access-w58d5\") pod \"redhat-operators-tgrkc\" (UID: \"c8cae08f-22ef-429b-81b7-6c555e602a07\") " pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:09 crc kubenswrapper[4669]: I1001 12:04:09.013704 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:09 crc kubenswrapper[4669]: I1001 12:04:09.517929 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tgrkc"] Oct 01 12:04:09 crc kubenswrapper[4669]: I1001 12:04:09.578343 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgrkc" event={"ID":"c8cae08f-22ef-429b-81b7-6c555e602a07","Type":"ContainerStarted","Data":"a07914dd4ba0db34d4cbd68b82c3a80e384e736402ba4fa0f3173cb3eeced365"} Oct 01 12:04:10 crc kubenswrapper[4669]: I1001 12:04:10.593386 4669 generic.go:334] "Generic (PLEG): container finished" podID="c8cae08f-22ef-429b-81b7-6c555e602a07" containerID="8e104160071b91c2f13a76a31be72ef4419742b1c7741cc4f7fc7aa19a368341" exitCode=0 Oct 01 12:04:10 crc kubenswrapper[4669]: I1001 12:04:10.593475 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgrkc" event={"ID":"c8cae08f-22ef-429b-81b7-6c555e602a07","Type":"ContainerDied","Data":"8e104160071b91c2f13a76a31be72ef4419742b1c7741cc4f7fc7aa19a368341"} Oct 01 12:04:12 crc kubenswrapper[4669]: I1001 12:04:12.622306 4669 generic.go:334] "Generic (PLEG): container finished" podID="c8cae08f-22ef-429b-81b7-6c555e602a07" containerID="5f1b263b22d4aece97837b08521ead1e59b2d55c3472bf89d572e1f8c8cdfa03" 
exitCode=0 Oct 01 12:04:12 crc kubenswrapper[4669]: I1001 12:04:12.622403 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgrkc" event={"ID":"c8cae08f-22ef-429b-81b7-6c555e602a07","Type":"ContainerDied","Data":"5f1b263b22d4aece97837b08521ead1e59b2d55c3472bf89d572e1f8c8cdfa03"} Oct 01 12:04:13 crc kubenswrapper[4669]: I1001 12:04:13.639418 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgrkc" event={"ID":"c8cae08f-22ef-429b-81b7-6c555e602a07","Type":"ContainerStarted","Data":"f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97"} Oct 01 12:04:13 crc kubenswrapper[4669]: I1001 12:04:13.665827 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tgrkc" podStartSLOduration=3.147993178 podStartE2EDuration="5.665803316s" podCreationTimestamp="2025-10-01 12:04:08 +0000 UTC" firstStartedPulling="2025-10-01 12:04:10.596404784 +0000 UTC m=+2141.695969761" lastFinishedPulling="2025-10-01 12:04:13.114214922 +0000 UTC m=+2144.213779899" observedRunningTime="2025-10-01 12:04:13.660380902 +0000 UTC m=+2144.759945879" watchObservedRunningTime="2025-10-01 12:04:13.665803316 +0000 UTC m=+2144.765368293" Oct 01 12:04:19 crc kubenswrapper[4669]: I1001 12:04:19.013971 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:19 crc kubenswrapper[4669]: I1001 12:04:19.014756 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:19 crc kubenswrapper[4669]: I1001 12:04:19.093735 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:19 crc kubenswrapper[4669]: I1001 12:04:19.768143 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:19 crc kubenswrapper[4669]: I1001 12:04:19.874865 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tgrkc"] Oct 01 12:04:21 crc kubenswrapper[4669]: I1001 12:04:21.718981 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tgrkc" podUID="c8cae08f-22ef-429b-81b7-6c555e602a07" containerName="registry-server" containerID="cri-o://f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97" gracePeriod=2 Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.189623 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.237021 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w58d5\" (UniqueName: \"kubernetes.io/projected/c8cae08f-22ef-429b-81b7-6c555e602a07-kube-api-access-w58d5\") pod \"c8cae08f-22ef-429b-81b7-6c555e602a07\" (UID: \"c8cae08f-22ef-429b-81b7-6c555e602a07\") " Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.237477 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-catalog-content\") pod \"c8cae08f-22ef-429b-81b7-6c555e602a07\" (UID: \"c8cae08f-22ef-429b-81b7-6c555e602a07\") " Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.237737 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-utilities\") pod \"c8cae08f-22ef-429b-81b7-6c555e602a07\" (UID: \"c8cae08f-22ef-429b-81b7-6c555e602a07\") " Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.238399 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-utilities" (OuterVolumeSpecName: "utilities") pod "c8cae08f-22ef-429b-81b7-6c555e602a07" (UID: "c8cae08f-22ef-429b-81b7-6c555e602a07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.244745 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8cae08f-22ef-429b-81b7-6c555e602a07-kube-api-access-w58d5" (OuterVolumeSpecName: "kube-api-access-w58d5") pod "c8cae08f-22ef-429b-81b7-6c555e602a07" (UID: "c8cae08f-22ef-429b-81b7-6c555e602a07"). InnerVolumeSpecName "kube-api-access-w58d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.327135 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8cae08f-22ef-429b-81b7-6c555e602a07" (UID: "c8cae08f-22ef-429b-81b7-6c555e602a07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.339191 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.339220 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w58d5\" (UniqueName: \"kubernetes.io/projected/c8cae08f-22ef-429b-81b7-6c555e602a07-kube-api-access-w58d5\") on node \"crc\" DevicePath \"\"" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.339232 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8cae08f-22ef-429b-81b7-6c555e602a07-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.734756 4669 generic.go:334] "Generic (PLEG): container finished" podID="c8cae08f-22ef-429b-81b7-6c555e602a07" containerID="f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97" exitCode=0 Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.734813 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgrkc" event={"ID":"c8cae08f-22ef-429b-81b7-6c555e602a07","Type":"ContainerDied","Data":"f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97"} Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.734892 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgrkc" event={"ID":"c8cae08f-22ef-429b-81b7-6c555e602a07","Type":"ContainerDied","Data":"a07914dd4ba0db34d4cbd68b82c3a80e384e736402ba4fa0f3173cb3eeced365"} Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.734897 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tgrkc" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.734928 4669 scope.go:117] "RemoveContainer" containerID="f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.771615 4669 scope.go:117] "RemoveContainer" containerID="5f1b263b22d4aece97837b08521ead1e59b2d55c3472bf89d572e1f8c8cdfa03" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.797103 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tgrkc"] Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.816415 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tgrkc"] Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.831791 4669 scope.go:117] "RemoveContainer" containerID="8e104160071b91c2f13a76a31be72ef4419742b1c7741cc4f7fc7aa19a368341" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.866944 4669 scope.go:117] "RemoveContainer" containerID="f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97" Oct 01 12:04:22 crc kubenswrapper[4669]: E1001 12:04:22.867688 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97\": container with ID starting with f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97 not found: ID does not exist" containerID="f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.867728 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97"} err="failed to get container status \"f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97\": rpc error: code = NotFound desc = could not find container 
\"f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97\": container with ID starting with f210b4574136ea50d36eccb91fef0143e51bcb77c1959fac9b42a19fa1113a97 not found: ID does not exist" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.867759 4669 scope.go:117] "RemoveContainer" containerID="5f1b263b22d4aece97837b08521ead1e59b2d55c3472bf89d572e1f8c8cdfa03" Oct 01 12:04:22 crc kubenswrapper[4669]: E1001 12:04:22.868427 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f1b263b22d4aece97837b08521ead1e59b2d55c3472bf89d572e1f8c8cdfa03\": container with ID starting with 5f1b263b22d4aece97837b08521ead1e59b2d55c3472bf89d572e1f8c8cdfa03 not found: ID does not exist" containerID="5f1b263b22d4aece97837b08521ead1e59b2d55c3472bf89d572e1f8c8cdfa03" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.868549 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1b263b22d4aece97837b08521ead1e59b2d55c3472bf89d572e1f8c8cdfa03"} err="failed to get container status \"5f1b263b22d4aece97837b08521ead1e59b2d55c3472bf89d572e1f8c8cdfa03\": rpc error: code = NotFound desc = could not find container \"5f1b263b22d4aece97837b08521ead1e59b2d55c3472bf89d572e1f8c8cdfa03\": container with ID starting with 5f1b263b22d4aece97837b08521ead1e59b2d55c3472bf89d572e1f8c8cdfa03 not found: ID does not exist" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.868653 4669 scope.go:117] "RemoveContainer" containerID="8e104160071b91c2f13a76a31be72ef4419742b1c7741cc4f7fc7aa19a368341" Oct 01 12:04:22 crc kubenswrapper[4669]: E1001 12:04:22.869344 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e104160071b91c2f13a76a31be72ef4419742b1c7741cc4f7fc7aa19a368341\": container with ID starting with 8e104160071b91c2f13a76a31be72ef4419742b1c7741cc4f7fc7aa19a368341 not found: ID does not exist" 
containerID="8e104160071b91c2f13a76a31be72ef4419742b1c7741cc4f7fc7aa19a368341" Oct 01 12:04:22 crc kubenswrapper[4669]: I1001 12:04:22.869432 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e104160071b91c2f13a76a31be72ef4419742b1c7741cc4f7fc7aa19a368341"} err="failed to get container status \"8e104160071b91c2f13a76a31be72ef4419742b1c7741cc4f7fc7aa19a368341\": rpc error: code = NotFound desc = could not find container \"8e104160071b91c2f13a76a31be72ef4419742b1c7741cc4f7fc7aa19a368341\": container with ID starting with 8e104160071b91c2f13a76a31be72ef4419742b1c7741cc4f7fc7aa19a368341 not found: ID does not exist" Oct 01 12:04:23 crc kubenswrapper[4669]: I1001 12:04:23.665055 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8cae08f-22ef-429b-81b7-6c555e602a07" path="/var/lib/kubelet/pods/c8cae08f-22ef-429b-81b7-6c555e602a07/volumes" Oct 01 12:04:31 crc kubenswrapper[4669]: I1001 12:04:31.863898 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:04:31 crc kubenswrapper[4669]: I1001 12:04:31.865189 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:04:46 crc kubenswrapper[4669]: I1001 12:04:46.881541 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8r5wx"] Oct 01 12:04:46 crc kubenswrapper[4669]: E1001 12:04:46.883318 4669 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c8cae08f-22ef-429b-81b7-6c555e602a07" containerName="extract-content" Oct 01 12:04:46 crc kubenswrapper[4669]: I1001 12:04:46.883354 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cae08f-22ef-429b-81b7-6c555e602a07" containerName="extract-content" Oct 01 12:04:46 crc kubenswrapper[4669]: E1001 12:04:46.883404 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8cae08f-22ef-429b-81b7-6c555e602a07" containerName="extract-utilities" Oct 01 12:04:46 crc kubenswrapper[4669]: I1001 12:04:46.883425 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cae08f-22ef-429b-81b7-6c555e602a07" containerName="extract-utilities" Oct 01 12:04:46 crc kubenswrapper[4669]: E1001 12:04:46.883498 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8cae08f-22ef-429b-81b7-6c555e602a07" containerName="registry-server" Oct 01 12:04:46 crc kubenswrapper[4669]: I1001 12:04:46.883520 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cae08f-22ef-429b-81b7-6c555e602a07" containerName="registry-server" Oct 01 12:04:46 crc kubenswrapper[4669]: I1001 12:04:46.883989 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8cae08f-22ef-429b-81b7-6c555e602a07" containerName="registry-server" Oct 01 12:04:46 crc kubenswrapper[4669]: I1001 12:04:46.886713 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:46 crc kubenswrapper[4669]: I1001 12:04:46.902226 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8r5wx"] Oct 01 12:04:47 crc kubenswrapper[4669]: I1001 12:04:47.067251 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-utilities\") pod \"community-operators-8r5wx\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:47 crc kubenswrapper[4669]: I1001 12:04:47.067308 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-catalog-content\") pod \"community-operators-8r5wx\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:47 crc kubenswrapper[4669]: I1001 12:04:47.067374 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hksx7\" (UniqueName: \"kubernetes.io/projected/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-kube-api-access-hksx7\") pod \"community-operators-8r5wx\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:47 crc kubenswrapper[4669]: I1001 12:04:47.169002 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-utilities\") pod \"community-operators-8r5wx\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:47 crc kubenswrapper[4669]: I1001 12:04:47.169064 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-catalog-content\") pod \"community-operators-8r5wx\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:47 crc kubenswrapper[4669]: I1001 12:04:47.169181 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hksx7\" (UniqueName: \"kubernetes.io/projected/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-kube-api-access-hksx7\") pod \"community-operators-8r5wx\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:47 crc kubenswrapper[4669]: I1001 12:04:47.170142 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-utilities\") pod \"community-operators-8r5wx\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:47 crc kubenswrapper[4669]: I1001 12:04:47.170291 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-catalog-content\") pod \"community-operators-8r5wx\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:47 crc kubenswrapper[4669]: I1001 12:04:47.193363 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hksx7\" (UniqueName: \"kubernetes.io/projected/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-kube-api-access-hksx7\") pod \"community-operators-8r5wx\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:47 crc kubenswrapper[4669]: I1001 12:04:47.222829 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:47 crc kubenswrapper[4669]: I1001 12:04:47.761517 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8r5wx"] Oct 01 12:04:48 crc kubenswrapper[4669]: I1001 12:04:48.015551 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8r5wx" event={"ID":"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d","Type":"ContainerStarted","Data":"eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7"} Oct 01 12:04:48 crc kubenswrapper[4669]: I1001 12:04:48.015616 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8r5wx" event={"ID":"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d","Type":"ContainerStarted","Data":"92dc504df1d3bb8935dc80e67a215afea8673d1fbd7c6f9355894203e1713d4a"} Oct 01 12:04:49 crc kubenswrapper[4669]: I1001 12:04:49.028782 4669 generic.go:334] "Generic (PLEG): container finished" podID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" containerID="eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7" exitCode=0 Oct 01 12:04:49 crc kubenswrapper[4669]: I1001 12:04:49.028842 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8r5wx" event={"ID":"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d","Type":"ContainerDied","Data":"eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7"} Oct 01 12:04:50 crc kubenswrapper[4669]: I1001 12:04:50.050243 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8r5wx" event={"ID":"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d","Type":"ContainerStarted","Data":"b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac"} Oct 01 12:04:51 crc kubenswrapper[4669]: I1001 12:04:51.066687 4669 generic.go:334] "Generic (PLEG): container finished" podID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" 
containerID="b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac" exitCode=0 Oct 01 12:04:51 crc kubenswrapper[4669]: I1001 12:04:51.066811 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8r5wx" event={"ID":"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d","Type":"ContainerDied","Data":"b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac"} Oct 01 12:04:52 crc kubenswrapper[4669]: I1001 12:04:52.084030 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8r5wx" event={"ID":"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d","Type":"ContainerStarted","Data":"6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da"} Oct 01 12:04:52 crc kubenswrapper[4669]: I1001 12:04:52.117238 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8r5wx" podStartSLOduration=3.538461454 podStartE2EDuration="6.117217659s" podCreationTimestamp="2025-10-01 12:04:46 +0000 UTC" firstStartedPulling="2025-10-01 12:04:49.03138286 +0000 UTC m=+2180.130947837" lastFinishedPulling="2025-10-01 12:04:51.610139065 +0000 UTC m=+2182.709704042" observedRunningTime="2025-10-01 12:04:52.110057362 +0000 UTC m=+2183.209622349" watchObservedRunningTime="2025-10-01 12:04:52.117217659 +0000 UTC m=+2183.216782646" Oct 01 12:04:57 crc kubenswrapper[4669]: I1001 12:04:57.223047 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:57 crc kubenswrapper[4669]: I1001 12:04:57.224940 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:57 crc kubenswrapper[4669]: I1001 12:04:57.295693 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:58 crc kubenswrapper[4669]: I1001 
12:04:58.208627 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:04:58 crc kubenswrapper[4669]: I1001 12:04:58.261908 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8r5wx"] Oct 01 12:05:00 crc kubenswrapper[4669]: I1001 12:05:00.186449 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8r5wx" podUID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" containerName="registry-server" containerID="cri-o://6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da" gracePeriod=2 Oct 01 12:05:00 crc kubenswrapper[4669]: I1001 12:05:00.684005 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:05:00 crc kubenswrapper[4669]: I1001 12:05:00.811745 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-catalog-content\") pod \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " Oct 01 12:05:00 crc kubenswrapper[4669]: I1001 12:05:00.811920 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hksx7\" (UniqueName: \"kubernetes.io/projected/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-kube-api-access-hksx7\") pod \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " Oct 01 12:05:00 crc kubenswrapper[4669]: I1001 12:05:00.812114 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-utilities\") pod \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\" (UID: \"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d\") " Oct 01 12:05:00 crc kubenswrapper[4669]: 
I1001 12:05:00.814176 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-utilities" (OuterVolumeSpecName: "utilities") pod "0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" (UID: "0151c4b4-2cbd-4199-a7ad-47cfdce1db0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:05:00 crc kubenswrapper[4669]: I1001 12:05:00.819884 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-kube-api-access-hksx7" (OuterVolumeSpecName: "kube-api-access-hksx7") pod "0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" (UID: "0151c4b4-2cbd-4199-a7ad-47cfdce1db0d"). InnerVolumeSpecName "kube-api-access-hksx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:05:00 crc kubenswrapper[4669]: I1001 12:05:00.915181 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hksx7\" (UniqueName: \"kubernetes.io/projected/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-kube-api-access-hksx7\") on node \"crc\" DevicePath \"\"" Oct 01 12:05:00 crc kubenswrapper[4669]: I1001 12:05:00.915874 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.128065 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" (UID: "0151c4b4-2cbd-4199-a7ad-47cfdce1db0d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.197743 4669 generic.go:334] "Generic (PLEG): container finished" podID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" containerID="6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da" exitCode=0 Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.197795 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8r5wx" event={"ID":"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d","Type":"ContainerDied","Data":"6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da"} Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.197824 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8r5wx" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.197838 4669 scope.go:117] "RemoveContainer" containerID="6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.197826 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8r5wx" event={"ID":"0151c4b4-2cbd-4199-a7ad-47cfdce1db0d","Type":"ContainerDied","Data":"92dc504df1d3bb8935dc80e67a215afea8673d1fbd7c6f9355894203e1713d4a"} Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.221952 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.233798 4669 scope.go:117] "RemoveContainer" containerID="b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.242126 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8r5wx"] Oct 01 12:05:01 crc kubenswrapper[4669]: 
I1001 12:05:01.252063 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8r5wx"] Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.271849 4669 scope.go:117] "RemoveContainer" containerID="eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.328678 4669 scope.go:117] "RemoveContainer" containerID="6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da" Oct 01 12:05:01 crc kubenswrapper[4669]: E1001 12:05:01.330122 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da\": container with ID starting with 6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da not found: ID does not exist" containerID="6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.330355 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da"} err="failed to get container status \"6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da\": rpc error: code = NotFound desc = could not find container \"6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da\": container with ID starting with 6619686434cda43317dc6ee680ac9abef1c09ff6326b621e80f7c51d20b515da not found: ID does not exist" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.330539 4669 scope.go:117] "RemoveContainer" containerID="b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac" Oct 01 12:05:01 crc kubenswrapper[4669]: E1001 12:05:01.331152 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac\": container 
with ID starting with b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac not found: ID does not exist" containerID="b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.331201 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac"} err="failed to get container status \"b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac\": rpc error: code = NotFound desc = could not find container \"b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac\": container with ID starting with b33cadffa7d3dfe3bd7e54fa950a3a6e43cad7f76c4d9464c9f79625930c4fac not found: ID does not exist" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.331230 4669 scope.go:117] "RemoveContainer" containerID="eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7" Oct 01 12:05:01 crc kubenswrapper[4669]: E1001 12:05:01.331665 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7\": container with ID starting with eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7 not found: ID does not exist" containerID="eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.331698 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7"} err="failed to get container status \"eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7\": rpc error: code = NotFound desc = could not find container \"eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7\": container with ID starting with eea8b833c9b72911f2ccc9e1227d4e5167c3269759f69c5f1c0892976ace29f7 not 
found: ID does not exist" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.661426 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" path="/var/lib/kubelet/pods/0151c4b4-2cbd-4199-a7ad-47cfdce1db0d/volumes" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.863711 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.864171 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.864224 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.865182 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2127b7b66421d1ad91fee0262eada6fd44bf5302e446225cc270993aced2bd3f"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:05:01 crc kubenswrapper[4669]: I1001 12:05:01.865259 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" 
containerID="cri-o://2127b7b66421d1ad91fee0262eada6fd44bf5302e446225cc270993aced2bd3f" gracePeriod=600 Oct 01 12:05:02 crc kubenswrapper[4669]: I1001 12:05:02.211943 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="2127b7b66421d1ad91fee0262eada6fd44bf5302e446225cc270993aced2bd3f" exitCode=0 Oct 01 12:05:02 crc kubenswrapper[4669]: I1001 12:05:02.212009 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"2127b7b66421d1ad91fee0262eada6fd44bf5302e446225cc270993aced2bd3f"} Oct 01 12:05:02 crc kubenswrapper[4669]: I1001 12:05:02.212095 4669 scope.go:117] "RemoveContainer" containerID="a297e0fd09af89de457d3cafd1987cba9f9f23b3ee7b1d49e21e4bc915943b55" Oct 01 12:05:03 crc kubenswrapper[4669]: I1001 12:05:03.232275 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc"} Oct 01 12:07:31 crc kubenswrapper[4669]: I1001 12:07:31.863278 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:07:31 crc kubenswrapper[4669]: I1001 12:07:31.864003 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:08:01 crc kubenswrapper[4669]: 
I1001 12:08:01.864032 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:08:01 crc kubenswrapper[4669]: I1001 12:08:01.865132 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:08:31 crc kubenswrapper[4669]: I1001 12:08:31.863273 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:08:31 crc kubenswrapper[4669]: I1001 12:08:31.867741 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:08:31 crc kubenswrapper[4669]: I1001 12:08:31.868064 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 12:08:31 crc kubenswrapper[4669]: I1001 12:08:31.869835 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc"} 
pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:08:31 crc kubenswrapper[4669]: I1001 12:08:31.870190 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" gracePeriod=600 Oct 01 12:08:31 crc kubenswrapper[4669]: E1001 12:08:31.998725 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:08:32 crc kubenswrapper[4669]: I1001 12:08:32.832176 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" exitCode=0 Oct 01 12:08:32 crc kubenswrapper[4669]: I1001 12:08:32.832226 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc"} Oct 01 12:08:32 crc kubenswrapper[4669]: I1001 12:08:32.832285 4669 scope.go:117] "RemoveContainer" containerID="2127b7b66421d1ad91fee0262eada6fd44bf5302e446225cc270993aced2bd3f" Oct 01 12:08:32 crc kubenswrapper[4669]: I1001 12:08:32.833315 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 
01 12:08:32 crc kubenswrapper[4669]: E1001 12:08:32.833700 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:08:37 crc kubenswrapper[4669]: I1001 12:08:37.896956 4669 generic.go:334] "Generic (PLEG): container finished" podID="9f57f089-5ea5-4b92-acbb-e14488a50253" containerID="b7dbce73c3f9fd68dd91d7467dac47e4a535c486575dd445c75c18fc3580f823" exitCode=0 Oct 01 12:08:37 crc kubenswrapper[4669]: I1001 12:08:37.897053 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" event={"ID":"9f57f089-5ea5-4b92-acbb-e14488a50253","Type":"ContainerDied","Data":"b7dbce73c3f9fd68dd91d7467dac47e4a535c486575dd445c75c18fc3580f823"} Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.384636 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.550418 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjs6h\" (UniqueName: \"kubernetes.io/projected/9f57f089-5ea5-4b92-acbb-e14488a50253-kube-api-access-gjs6h\") pod \"9f57f089-5ea5-4b92-acbb-e14488a50253\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.550566 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-ssh-key\") pod \"9f57f089-5ea5-4b92-acbb-e14488a50253\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.550638 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-combined-ca-bundle\") pod \"9f57f089-5ea5-4b92-acbb-e14488a50253\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.550746 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-secret-0\") pod \"9f57f089-5ea5-4b92-acbb-e14488a50253\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.550783 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-inventory\") pod \"9f57f089-5ea5-4b92-acbb-e14488a50253\" (UID: \"9f57f089-5ea5-4b92-acbb-e14488a50253\") " Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.559122 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/9f57f089-5ea5-4b92-acbb-e14488a50253-kube-api-access-gjs6h" (OuterVolumeSpecName: "kube-api-access-gjs6h") pod "9f57f089-5ea5-4b92-acbb-e14488a50253" (UID: "9f57f089-5ea5-4b92-acbb-e14488a50253"). InnerVolumeSpecName "kube-api-access-gjs6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.559317 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9f57f089-5ea5-4b92-acbb-e14488a50253" (UID: "9f57f089-5ea5-4b92-acbb-e14488a50253"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.582367 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "9f57f089-5ea5-4b92-acbb-e14488a50253" (UID: "9f57f089-5ea5-4b92-acbb-e14488a50253"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.582799 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-inventory" (OuterVolumeSpecName: "inventory") pod "9f57f089-5ea5-4b92-acbb-e14488a50253" (UID: "9f57f089-5ea5-4b92-acbb-e14488a50253"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.584519 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f57f089-5ea5-4b92-acbb-e14488a50253" (UID: "9f57f089-5ea5-4b92-acbb-e14488a50253"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.652711 4669 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.652744 4669 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.652753 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.652762 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjs6h\" (UniqueName: \"kubernetes.io/projected/9f57f089-5ea5-4b92-acbb-e14488a50253-kube-api-access-gjs6h\") on node \"crc\" DevicePath \"\"" Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.652771 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f57f089-5ea5-4b92-acbb-e14488a50253-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.920819 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" event={"ID":"9f57f089-5ea5-4b92-acbb-e14488a50253","Type":"ContainerDied","Data":"69c322cb0f6d577c357bd2c7852b15bae494cdd20f91bd070fbffbeb441d12b4"} Oct 01 12:08:39 crc kubenswrapper[4669]: I1001 12:08:39.921476 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69c322cb0f6d577c357bd2c7852b15bae494cdd20f91bd070fbffbeb441d12b4" Oct 01 12:08:39 
crc kubenswrapper[4669]: I1001 12:08:39.921135 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.070125 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n"] Oct 01 12:08:40 crc kubenswrapper[4669]: E1001 12:08:40.070736 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f57f089-5ea5-4b92-acbb-e14488a50253" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.070764 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f57f089-5ea5-4b92-acbb-e14488a50253" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 12:08:40 crc kubenswrapper[4669]: E1001 12:08:40.070794 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" containerName="extract-utilities" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.070806 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" containerName="extract-utilities" Oct 01 12:08:40 crc kubenswrapper[4669]: E1001 12:08:40.070825 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" containerName="registry-server" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.070832 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" containerName="registry-server" Oct 01 12:08:40 crc kubenswrapper[4669]: E1001 12:08:40.070867 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" containerName="extract-content" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.070876 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" 
containerName="extract-content" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.071067 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="0151c4b4-2cbd-4199-a7ad-47cfdce1db0d" containerName="registry-server" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.071107 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f57f089-5ea5-4b92-acbb-e14488a50253" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.071915 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.081535 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n"] Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.123691 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.123910 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.124015 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.124277 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.124448 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.124741 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.125519 4669 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.167136 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.167201 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.167248 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.167274 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.167740 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.167929 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.167963 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.168072 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw4lz\" (UniqueName: \"kubernetes.io/projected/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-kube-api-access-bw4lz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.168178 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.270219 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw4lz\" (UniqueName: \"kubernetes.io/projected/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-kube-api-access-bw4lz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.270337 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.270384 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.270411 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.270443 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.270481 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.270548 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.270585 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.270608 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.272008 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.277220 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.277423 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.277623 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 
01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.277754 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.279149 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.279352 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.294550 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.295581 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw4lz\" (UniqueName: \"kubernetes.io/projected/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-kube-api-access-bw4lz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zr89n\" (UID: 
\"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:40 crc kubenswrapper[4669]: I1001 12:08:40.458970 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:08:41 crc kubenswrapper[4669]: I1001 12:08:41.033103 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n"] Oct 01 12:08:41 crc kubenswrapper[4669]: I1001 12:08:41.053651 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:08:41 crc kubenswrapper[4669]: I1001 12:08:41.940666 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" event={"ID":"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97","Type":"ContainerStarted","Data":"0acbbc4a9f77350ba3a7d1a8042093dc1cd0eceb4c346fc62ee1fde999f79f59"} Oct 01 12:08:41 crc kubenswrapper[4669]: I1001 12:08:41.941323 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" event={"ID":"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97","Type":"ContainerStarted","Data":"283ce9f928ea79fced274582021727b1a906a3e6d3f147a6c38e773d84a0a6cd"} Oct 01 12:08:41 crc kubenswrapper[4669]: I1001 12:08:41.967250 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" podStartSLOduration=1.431642302 podStartE2EDuration="1.967224741s" podCreationTimestamp="2025-10-01 12:08:40 +0000 UTC" firstStartedPulling="2025-10-01 12:08:41.053200197 +0000 UTC m=+2412.152765184" lastFinishedPulling="2025-10-01 12:08:41.588782646 +0000 UTC m=+2412.688347623" observedRunningTime="2025-10-01 12:08:41.959374906 +0000 UTC m=+2413.058939883" watchObservedRunningTime="2025-10-01 12:08:41.967224741 +0000 UTC m=+2413.066789718" Oct 01 12:08:46 crc 
kubenswrapper[4669]: I1001 12:08:46.644346 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:08:46 crc kubenswrapper[4669]: E1001 12:08:46.644834 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:08:58 crc kubenswrapper[4669]: I1001 12:08:58.645062 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:08:58 crc kubenswrapper[4669]: E1001 12:08:58.645978 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:09:10 crc kubenswrapper[4669]: I1001 12:09:10.645355 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:09:10 crc kubenswrapper[4669]: E1001 12:09:10.646492 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 
01 12:09:21 crc kubenswrapper[4669]: I1001 12:09:21.644716 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:09:21 crc kubenswrapper[4669]: E1001 12:09:21.646097 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:09:32 crc kubenswrapper[4669]: I1001 12:09:32.645167 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:09:32 crc kubenswrapper[4669]: E1001 12:09:32.646228 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:09:46 crc kubenswrapper[4669]: I1001 12:09:46.644587 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:09:46 crc kubenswrapper[4669]: E1001 12:09:46.645802 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" 
podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:09:58 crc kubenswrapper[4669]: I1001 12:09:58.645397 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:09:58 crc kubenswrapper[4669]: E1001 12:09:58.646698 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:10:09 crc kubenswrapper[4669]: I1001 12:10:09.658466 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:10:09 crc kubenswrapper[4669]: E1001 12:10:09.659171 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:10:21 crc kubenswrapper[4669]: I1001 12:10:21.646599 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:10:21 crc kubenswrapper[4669]: E1001 12:10:21.647664 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:10:34 crc kubenswrapper[4669]: I1001 12:10:34.644762 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:10:34 crc kubenswrapper[4669]: E1001 12:10:34.646279 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:10:45 crc kubenswrapper[4669]: I1001 12:10:45.644372 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:10:45 crc kubenswrapper[4669]: E1001 12:10:45.646738 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:10:58 crc kubenswrapper[4669]: I1001 12:10:58.644850 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:10:58 crc kubenswrapper[4669]: E1001 12:10:58.646354 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:11:09 crc kubenswrapper[4669]: I1001 12:11:09.667475 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:11:09 crc kubenswrapper[4669]: E1001 12:11:09.669231 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:11:23 crc kubenswrapper[4669]: I1001 12:11:23.644752 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:11:23 crc kubenswrapper[4669]: E1001 12:11:23.646231 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:11:35 crc kubenswrapper[4669]: I1001 12:11:35.645607 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:11:35 crc kubenswrapper[4669]: E1001 12:11:35.647225 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:11:46 crc kubenswrapper[4669]: I1001 12:11:46.644951 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:11:46 crc kubenswrapper[4669]: E1001 12:11:46.646340 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:11:59 crc kubenswrapper[4669]: I1001 12:11:59.666144 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:11:59 crc kubenswrapper[4669]: E1001 12:11:59.668695 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:12:13 crc kubenswrapper[4669]: I1001 12:12:13.645701 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:12:13 crc kubenswrapper[4669]: E1001 12:12:13.647625 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:12:27 crc kubenswrapper[4669]: I1001 12:12:27.644435 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:12:27 crc kubenswrapper[4669]: E1001 12:12:27.645617 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:12:38 crc kubenswrapper[4669]: I1001 12:12:38.644133 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:12:38 crc kubenswrapper[4669]: E1001 12:12:38.645601 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:12:39 crc kubenswrapper[4669]: I1001 12:12:39.926520 4669 generic.go:334] "Generic (PLEG): container finished" podID="da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" containerID="0acbbc4a9f77350ba3a7d1a8042093dc1cd0eceb4c346fc62ee1fde999f79f59" exitCode=0 Oct 01 12:12:39 crc kubenswrapper[4669]: I1001 12:12:39.926696 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" event={"ID":"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97","Type":"ContainerDied","Data":"0acbbc4a9f77350ba3a7d1a8042093dc1cd0eceb4c346fc62ee1fde999f79f59"} Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.439925 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.610009 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-1\") pod \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.610205 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-0\") pod \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.610251 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-0\") pod \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.610300 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-combined-ca-bundle\") pod \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.610354 4669 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-inventory\") pod \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.610405 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-1\") pod \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.610479 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-extra-config-0\") pod \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.610601 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw4lz\" (UniqueName: \"kubernetes.io/projected/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-kube-api-access-bw4lz\") pod \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.610637 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-ssh-key\") pod \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\" (UID: \"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97\") " Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.618593 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod 
"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" (UID: "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.627494 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-kube-api-access-bw4lz" (OuterVolumeSpecName: "kube-api-access-bw4lz") pod "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" (UID: "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97"). InnerVolumeSpecName "kube-api-access-bw4lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.644386 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" (UID: "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.647070 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-inventory" (OuterVolumeSpecName: "inventory") pod "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" (UID: "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.652607 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" (UID: "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.656071 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" (UID: "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.660481 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" (UID: "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.668340 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" (UID: "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.668357 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" (UID: "da3d07f3-8fb0-4ab3-a350-ad5b2a09af97"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.714571 4669 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.714634 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.714655 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw4lz\" (UniqueName: \"kubernetes.io/projected/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-kube-api-access-bw4lz\") on node \"crc\" DevicePath \"\"" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.714679 4669 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.714698 4669 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.714716 4669 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.714740 4669 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.714758 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.714778 4669 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/da3d07f3-8fb0-4ab3-a350-ad5b2a09af97-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.967733 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" event={"ID":"da3d07f3-8fb0-4ab3-a350-ad5b2a09af97","Type":"ContainerDied","Data":"283ce9f928ea79fced274582021727b1a906a3e6d3f147a6c38e773d84a0a6cd"} Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.967814 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="283ce9f928ea79fced274582021727b1a906a3e6d3f147a6c38e773d84a0a6cd" Oct 01 12:12:41 crc kubenswrapper[4669]: I1001 12:12:41.967924 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zr89n" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.101634 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl"] Oct 01 12:12:42 crc kubenswrapper[4669]: E1001 12:12:42.102210 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.102235 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.102549 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="da3d07f3-8fb0-4ab3-a350-ad5b2a09af97" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.103492 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.107813 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvgp5" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.108025 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.108243 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.109580 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.109687 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.117124 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl"] Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.226901 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.226983 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.227107 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk5th\" (UniqueName: \"kubernetes.io/projected/d1966594-3c43-4ecf-a982-fc851d0bb43b-kube-api-access-kk5th\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.227233 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.227549 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.227624 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.227859 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.330936 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.331180 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.331242 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.331423 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.331549 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.331663 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.331862 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk5th\" (UniqueName: \"kubernetes.io/projected/d1966594-3c43-4ecf-a982-fc851d0bb43b-kube-api-access-kk5th\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.336732 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-inventory\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.336808 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.337798 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.339172 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.340984 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: 
I1001 12:12:42.342886 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.353018 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk5th\" (UniqueName: \"kubernetes.io/projected/d1966594-3c43-4ecf-a982-fc851d0bb43b-kube-api-access-kk5th\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.443536 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:12:42 crc kubenswrapper[4669]: I1001 12:12:42.981809 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl"] Oct 01 12:12:43 crc kubenswrapper[4669]: I1001 12:12:43.989367 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" event={"ID":"d1966594-3c43-4ecf-a982-fc851d0bb43b","Type":"ContainerStarted","Data":"9ae656aae8e3678a83c7ef80aa6769b7215f365c224185ec0f07229ed0d8cfa3"} Oct 01 12:12:43 crc kubenswrapper[4669]: I1001 12:12:43.990280 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" event={"ID":"d1966594-3c43-4ecf-a982-fc851d0bb43b","Type":"ContainerStarted","Data":"26d8af4a4450f51ac2ad24aed15f80f9e49f6c3ca7440089441e29400febea68"} Oct 01 12:12:44 crc kubenswrapper[4669]: I1001 12:12:44.013067 4669 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" podStartSLOduration=1.366682498 podStartE2EDuration="2.013037991s" podCreationTimestamp="2025-10-01 12:12:42 +0000 UTC" firstStartedPulling="2025-10-01 12:12:42.99212841 +0000 UTC m=+2654.091693387" lastFinishedPulling="2025-10-01 12:12:43.638483903 +0000 UTC m=+2654.738048880" observedRunningTime="2025-10-01 12:12:44.010362585 +0000 UTC m=+2655.109927562" watchObservedRunningTime="2025-10-01 12:12:44.013037991 +0000 UTC m=+2655.112602968" Oct 01 12:12:52 crc kubenswrapper[4669]: I1001 12:12:52.645582 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:12:52 crc kubenswrapper[4669]: E1001 12:12:52.649572 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:13:03 crc kubenswrapper[4669]: I1001 12:13:03.645255 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:13:03 crc kubenswrapper[4669]: E1001 12:13:03.646269 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:13:16 crc kubenswrapper[4669]: I1001 12:13:16.645524 4669 
scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:13:16 crc kubenswrapper[4669]: E1001 12:13:16.646904 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:13:29 crc kubenswrapper[4669]: I1001 12:13:29.656755 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:13:29 crc kubenswrapper[4669]: E1001 12:13:29.658024 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:13:41 crc kubenswrapper[4669]: I1001 12:13:41.644057 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:13:42 crc kubenswrapper[4669]: I1001 12:13:42.670862 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"9de938d4db4a6b2603c7c88233b8827064d7ec05a2d9ca7acc89aebe3d5259a4"} Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.077375 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-27xp6"] Oct 01 12:13:50 crc 
kubenswrapper[4669]: I1001 12:13:50.080209 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.111128 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-utilities\") pod \"redhat-marketplace-27xp6\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") " pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.111234 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-catalog-content\") pod \"redhat-marketplace-27xp6\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") " pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.111330 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxfg6\" (UniqueName: \"kubernetes.io/projected/943e55d0-a79d-4712-b400-5ea39dadcbe0-kube-api-access-mxfg6\") pod \"redhat-marketplace-27xp6\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") " pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.135122 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27xp6"] Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.213694 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxfg6\" (UniqueName: \"kubernetes.io/projected/943e55d0-a79d-4712-b400-5ea39dadcbe0-kube-api-access-mxfg6\") pod \"redhat-marketplace-27xp6\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") " pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 
12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.213776 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-utilities\") pod \"redhat-marketplace-27xp6\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") " pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.213835 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-catalog-content\") pod \"redhat-marketplace-27xp6\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") " pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.214345 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-catalog-content\") pod \"redhat-marketplace-27xp6\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") " pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.214707 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-utilities\") pod \"redhat-marketplace-27xp6\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") " pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.238979 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxfg6\" (UniqueName: \"kubernetes.io/projected/943e55d0-a79d-4712-b400-5ea39dadcbe0-kube-api-access-mxfg6\") pod \"redhat-marketplace-27xp6\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") " pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.402631 4669 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:13:50 crc kubenswrapper[4669]: I1001 12:13:50.890496 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27xp6"] Oct 01 12:13:51 crc kubenswrapper[4669]: I1001 12:13:51.786447 4669 generic.go:334] "Generic (PLEG): container finished" podID="943e55d0-a79d-4712-b400-5ea39dadcbe0" containerID="951e211a98cc3c641259c6365d5f601f812aae6563ee9af5a92079d1ee64923a" exitCode=0 Oct 01 12:13:51 crc kubenswrapper[4669]: I1001 12:13:51.786632 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xp6" event={"ID":"943e55d0-a79d-4712-b400-5ea39dadcbe0","Type":"ContainerDied","Data":"951e211a98cc3c641259c6365d5f601f812aae6563ee9af5a92079d1ee64923a"} Oct 01 12:13:51 crc kubenswrapper[4669]: I1001 12:13:51.787325 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xp6" event={"ID":"943e55d0-a79d-4712-b400-5ea39dadcbe0","Type":"ContainerStarted","Data":"96c346481f1af3fd7357f701f25faa350dba79dcdfd38bb0fd496daf48726988"} Oct 01 12:13:51 crc kubenswrapper[4669]: I1001 12:13:51.790536 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:13:53 crc kubenswrapper[4669]: I1001 12:13:53.809175 4669 generic.go:334] "Generic (PLEG): container finished" podID="943e55d0-a79d-4712-b400-5ea39dadcbe0" containerID="1971311baa7cfa7797721f35ac529773fdc0e09feb972510002a0e6f6abe106e" exitCode=0 Oct 01 12:13:53 crc kubenswrapper[4669]: I1001 12:13:53.809262 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xp6" event={"ID":"943e55d0-a79d-4712-b400-5ea39dadcbe0","Type":"ContainerDied","Data":"1971311baa7cfa7797721f35ac529773fdc0e09feb972510002a0e6f6abe106e"} Oct 01 12:13:55 crc kubenswrapper[4669]: I1001 12:13:55.836001 
4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xp6" event={"ID":"943e55d0-a79d-4712-b400-5ea39dadcbe0","Type":"ContainerStarted","Data":"c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203"} Oct 01 12:13:55 crc kubenswrapper[4669]: I1001 12:13:55.863692 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-27xp6" podStartSLOduration=3.039171678 podStartE2EDuration="5.863668074s" podCreationTimestamp="2025-10-01 12:13:50 +0000 UTC" firstStartedPulling="2025-10-01 12:13:51.790183631 +0000 UTC m=+2722.889748618" lastFinishedPulling="2025-10-01 12:13:54.614680037 +0000 UTC m=+2725.714245014" observedRunningTime="2025-10-01 12:13:55.854896048 +0000 UTC m=+2726.954461035" watchObservedRunningTime="2025-10-01 12:13:55.863668074 +0000 UTC m=+2726.963233051" Oct 01 12:14:00 crc kubenswrapper[4669]: I1001 12:14:00.403624 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:14:00 crc kubenswrapper[4669]: I1001 12:14:00.404527 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:14:00 crc kubenswrapper[4669]: I1001 12:14:00.486024 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:14:00 crc kubenswrapper[4669]: I1001 12:14:00.967929 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-27xp6" Oct 01 12:14:01 crc kubenswrapper[4669]: I1001 12:14:01.042416 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27xp6"] Oct 01 12:14:02 crc kubenswrapper[4669]: I1001 12:14:02.917053 4669 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-27xp6" podUID="943e55d0-a79d-4712-b400-5ea39dadcbe0" containerName="registry-server" containerID="cri-o://c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203" gracePeriod=2
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.395503 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27xp6"
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.510169 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-utilities\") pod \"943e55d0-a79d-4712-b400-5ea39dadcbe0\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") "
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.510889 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-catalog-content\") pod \"943e55d0-a79d-4712-b400-5ea39dadcbe0\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") "
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.510945 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxfg6\" (UniqueName: \"kubernetes.io/projected/943e55d0-a79d-4712-b400-5ea39dadcbe0-kube-api-access-mxfg6\") pod \"943e55d0-a79d-4712-b400-5ea39dadcbe0\" (UID: \"943e55d0-a79d-4712-b400-5ea39dadcbe0\") "
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.511365 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-utilities" (OuterVolumeSpecName: "utilities") pod "943e55d0-a79d-4712-b400-5ea39dadcbe0" (UID: "943e55d0-a79d-4712-b400-5ea39dadcbe0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.511956 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.522398 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943e55d0-a79d-4712-b400-5ea39dadcbe0-kube-api-access-mxfg6" (OuterVolumeSpecName: "kube-api-access-mxfg6") pod "943e55d0-a79d-4712-b400-5ea39dadcbe0" (UID: "943e55d0-a79d-4712-b400-5ea39dadcbe0"). InnerVolumeSpecName "kube-api-access-mxfg6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.524578 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "943e55d0-a79d-4712-b400-5ea39dadcbe0" (UID: "943e55d0-a79d-4712-b400-5ea39dadcbe0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.615320 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxfg6\" (UniqueName: \"kubernetes.io/projected/943e55d0-a79d-4712-b400-5ea39dadcbe0-kube-api-access-mxfg6\") on node \"crc\" DevicePath \"\""
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.615361 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943e55d0-a79d-4712-b400-5ea39dadcbe0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.928364 4669 generic.go:334] "Generic (PLEG): container finished" podID="943e55d0-a79d-4712-b400-5ea39dadcbe0" containerID="c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203" exitCode=0
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.928415 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xp6" event={"ID":"943e55d0-a79d-4712-b400-5ea39dadcbe0","Type":"ContainerDied","Data":"c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203"}
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.928450 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27xp6" event={"ID":"943e55d0-a79d-4712-b400-5ea39dadcbe0","Type":"ContainerDied","Data":"96c346481f1af3fd7357f701f25faa350dba79dcdfd38bb0fd496daf48726988"}
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.928467 4669 scope.go:117] "RemoveContainer" containerID="c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203"
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.928462 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27xp6"
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.953318 4669 scope.go:117] "RemoveContainer" containerID="1971311baa7cfa7797721f35ac529773fdc0e09feb972510002a0e6f6abe106e"
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.956352 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27xp6"]
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.968589 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-27xp6"]
Oct 01 12:14:03 crc kubenswrapper[4669]: I1001 12:14:03.975458 4669 scope.go:117] "RemoveContainer" containerID="951e211a98cc3c641259c6365d5f601f812aae6563ee9af5a92079d1ee64923a"
Oct 01 12:14:04 crc kubenswrapper[4669]: I1001 12:14:04.019052 4669 scope.go:117] "RemoveContainer" containerID="c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203"
Oct 01 12:14:04 crc kubenswrapper[4669]: E1001 12:14:04.019499 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203\": container with ID starting with c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203 not found: ID does not exist" containerID="c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203"
Oct 01 12:14:04 crc kubenswrapper[4669]: I1001 12:14:04.019567 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203"} err="failed to get container status \"c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203\": rpc error: code = NotFound desc = could not find container \"c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203\": container with ID starting with c3120ca4ff5eb337c9463a97171e31581d5acc715a8a1d839df66b1e8396b203 not found: ID does not exist"
Oct 01 12:14:04 crc kubenswrapper[4669]: I1001 12:14:04.019599 4669 scope.go:117] "RemoveContainer" containerID="1971311baa7cfa7797721f35ac529773fdc0e09feb972510002a0e6f6abe106e"
Oct 01 12:14:04 crc kubenswrapper[4669]: E1001 12:14:04.020094 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1971311baa7cfa7797721f35ac529773fdc0e09feb972510002a0e6f6abe106e\": container with ID starting with 1971311baa7cfa7797721f35ac529773fdc0e09feb972510002a0e6f6abe106e not found: ID does not exist" containerID="1971311baa7cfa7797721f35ac529773fdc0e09feb972510002a0e6f6abe106e"
Oct 01 12:14:04 crc kubenswrapper[4669]: I1001 12:14:04.020126 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1971311baa7cfa7797721f35ac529773fdc0e09feb972510002a0e6f6abe106e"} err="failed to get container status \"1971311baa7cfa7797721f35ac529773fdc0e09feb972510002a0e6f6abe106e\": rpc error: code = NotFound desc = could not find container \"1971311baa7cfa7797721f35ac529773fdc0e09feb972510002a0e6f6abe106e\": container with ID starting with 1971311baa7cfa7797721f35ac529773fdc0e09feb972510002a0e6f6abe106e not found: ID does not exist"
Oct 01 12:14:04 crc kubenswrapper[4669]: I1001 12:14:04.020141 4669 scope.go:117] "RemoveContainer" containerID="951e211a98cc3c641259c6365d5f601f812aae6563ee9af5a92079d1ee64923a"
Oct 01 12:14:04 crc kubenswrapper[4669]: E1001 12:14:04.020540 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951e211a98cc3c641259c6365d5f601f812aae6563ee9af5a92079d1ee64923a\": container with ID starting with 951e211a98cc3c641259c6365d5f601f812aae6563ee9af5a92079d1ee64923a not found: ID does not exist" containerID="951e211a98cc3c641259c6365d5f601f812aae6563ee9af5a92079d1ee64923a"
Oct 01 12:14:04 crc kubenswrapper[4669]: I1001 12:14:04.020610 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951e211a98cc3c641259c6365d5f601f812aae6563ee9af5a92079d1ee64923a"} err="failed to get container status \"951e211a98cc3c641259c6365d5f601f812aae6563ee9af5a92079d1ee64923a\": rpc error: code = NotFound desc = could not find container \"951e211a98cc3c641259c6365d5f601f812aae6563ee9af5a92079d1ee64923a\": container with ID starting with 951e211a98cc3c641259c6365d5f601f812aae6563ee9af5a92079d1ee64923a not found: ID does not exist"
Oct 01 12:14:05 crc kubenswrapper[4669]: I1001 12:14:05.657263 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943e55d0-a79d-4712-b400-5ea39dadcbe0" path="/var/lib/kubelet/pods/943e55d0-a79d-4712-b400-5ea39dadcbe0/volumes"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.188310 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9k9qw"]
Oct 01 12:14:49 crc kubenswrapper[4669]: E1001 12:14:49.189599 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943e55d0-a79d-4712-b400-5ea39dadcbe0" containerName="extract-utilities"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.189618 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="943e55d0-a79d-4712-b400-5ea39dadcbe0" containerName="extract-utilities"
Oct 01 12:14:49 crc kubenswrapper[4669]: E1001 12:14:49.189655 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943e55d0-a79d-4712-b400-5ea39dadcbe0" containerName="extract-content"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.189663 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="943e55d0-a79d-4712-b400-5ea39dadcbe0" containerName="extract-content"
Oct 01 12:14:49 crc kubenswrapper[4669]: E1001 12:14:49.189708 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943e55d0-a79d-4712-b400-5ea39dadcbe0" containerName="registry-server"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.189716 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="943e55d0-a79d-4712-b400-5ea39dadcbe0" containerName="registry-server"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.189976 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="943e55d0-a79d-4712-b400-5ea39dadcbe0" containerName="registry-server"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.193215 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.228768 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9k9qw"]
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.320281 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-utilities\") pod \"certified-operators-9k9qw\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") " pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.320480 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jghzq\" (UniqueName: \"kubernetes.io/projected/8d80260c-d8f7-4714-b843-cb711248fbdd-kube-api-access-jghzq\") pod \"certified-operators-9k9qw\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") " pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.320712 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-catalog-content\") pod \"certified-operators-9k9qw\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") " pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.422851 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-utilities\") pod \"certified-operators-9k9qw\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") " pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.422932 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jghzq\" (UniqueName: \"kubernetes.io/projected/8d80260c-d8f7-4714-b843-cb711248fbdd-kube-api-access-jghzq\") pod \"certified-operators-9k9qw\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") " pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.423029 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-catalog-content\") pod \"certified-operators-9k9qw\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") " pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.424297 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-catalog-content\") pod \"certified-operators-9k9qw\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") " pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.426756 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-utilities\") pod \"certified-operators-9k9qw\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") " pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.454687 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jghzq\" (UniqueName: \"kubernetes.io/projected/8d80260c-d8f7-4714-b843-cb711248fbdd-kube-api-access-jghzq\") pod \"certified-operators-9k9qw\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") " pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:49 crc kubenswrapper[4669]: I1001 12:14:49.537894 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:50 crc kubenswrapper[4669]: I1001 12:14:50.070410 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9k9qw"]
Oct 01 12:14:50 crc kubenswrapper[4669]: I1001 12:14:50.429439 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k9qw" event={"ID":"8d80260c-d8f7-4714-b843-cb711248fbdd","Type":"ContainerStarted","Data":"d57903b25cc4cea2ba888ab232737dc9f96234ba3c80d22c7af58673569407c0"}
Oct 01 12:14:51 crc kubenswrapper[4669]: I1001 12:14:51.450993 4669 generic.go:334] "Generic (PLEG): container finished" podID="8d80260c-d8f7-4714-b843-cb711248fbdd" containerID="37045234d942d2bc199e6686e166db966690421e1d96bbf900fc68a602ea4165" exitCode=0
Oct 01 12:14:51 crc kubenswrapper[4669]: I1001 12:14:51.451061 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k9qw" event={"ID":"8d80260c-d8f7-4714-b843-cb711248fbdd","Type":"ContainerDied","Data":"37045234d942d2bc199e6686e166db966690421e1d96bbf900fc68a602ea4165"}
Oct 01 12:14:52 crc kubenswrapper[4669]: I1001 12:14:52.463047 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k9qw" event={"ID":"8d80260c-d8f7-4714-b843-cb711248fbdd","Type":"ContainerStarted","Data":"f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a"}
Oct 01 12:14:53 crc kubenswrapper[4669]: I1001 12:14:53.474139 4669 generic.go:334] "Generic (PLEG): container finished" podID="8d80260c-d8f7-4714-b843-cb711248fbdd" containerID="f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a" exitCode=0
Oct 01 12:14:53 crc kubenswrapper[4669]: I1001 12:14:53.474465 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k9qw" event={"ID":"8d80260c-d8f7-4714-b843-cb711248fbdd","Type":"ContainerDied","Data":"f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a"}
Oct 01 12:14:54 crc kubenswrapper[4669]: I1001 12:14:54.510741 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k9qw" event={"ID":"8d80260c-d8f7-4714-b843-cb711248fbdd","Type":"ContainerStarted","Data":"3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840"}
Oct 01 12:14:54 crc kubenswrapper[4669]: I1001 12:14:54.543173 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9k9qw" podStartSLOduration=2.860829487 podStartE2EDuration="5.543144368s" podCreationTimestamp="2025-10-01 12:14:49 +0000 UTC" firstStartedPulling="2025-10-01 12:14:51.453256684 +0000 UTC m=+2782.552821671" lastFinishedPulling="2025-10-01 12:14:54.135571575 +0000 UTC m=+2785.235136552" observedRunningTime="2025-10-01 12:14:54.536650747 +0000 UTC m=+2785.636215744" watchObservedRunningTime="2025-10-01 12:14:54.543144368 +0000 UTC m=+2785.642709355"
Oct 01 12:14:59 crc kubenswrapper[4669]: I1001 12:14:59.538687 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:59 crc kubenswrapper[4669]: I1001 12:14:59.539691 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:59 crc kubenswrapper[4669]: I1001 12:14:59.585572 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:59 crc kubenswrapper[4669]: I1001 12:14:59.665002 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:14:59 crc kubenswrapper[4669]: I1001 12:14:59.827803 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9k9qw"]
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.154720 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"]
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.156400 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.160417 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.160644 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.168199 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"]
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.196929 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq8sl\" (UniqueName: \"kubernetes.io/projected/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-kube-api-access-hq8sl\") pod \"collect-profiles-29322015-tm9p7\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.197059 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-config-volume\") pod \"collect-profiles-29322015-tm9p7\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.197145 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-secret-volume\") pod \"collect-profiles-29322015-tm9p7\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.298798 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-config-volume\") pod \"collect-profiles-29322015-tm9p7\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.298965 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-secret-volume\") pod \"collect-profiles-29322015-tm9p7\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.299134 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq8sl\" (UniqueName: \"kubernetes.io/projected/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-kube-api-access-hq8sl\") pod \"collect-profiles-29322015-tm9p7\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.299758 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-config-volume\") pod \"collect-profiles-29322015-tm9p7\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.312493 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-secret-volume\") pod \"collect-profiles-29322015-tm9p7\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.318763 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq8sl\" (UniqueName: \"kubernetes.io/projected/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-kube-api-access-hq8sl\") pod \"collect-profiles-29322015-tm9p7\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:00 crc kubenswrapper[4669]: I1001 12:15:00.495802 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:01 crc kubenswrapper[4669]: I1001 12:15:01.006933 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"]
Oct 01 12:15:01 crc kubenswrapper[4669]: I1001 12:15:01.596687 4669 generic.go:334] "Generic (PLEG): container finished" podID="d5f95d61-b7a4-4c81-b84c-f82513d75a4f" containerID="9e6d620f85b189e7bad838604b509ebede4f612e4e1a35393a6cfdf653ac03e3" exitCode=0
Oct 01 12:15:01 crc kubenswrapper[4669]: I1001 12:15:01.596757 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7" event={"ID":"d5f95d61-b7a4-4c81-b84c-f82513d75a4f","Type":"ContainerDied","Data":"9e6d620f85b189e7bad838604b509ebede4f612e4e1a35393a6cfdf653ac03e3"}
Oct 01 12:15:01 crc kubenswrapper[4669]: I1001 12:15:01.597503 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7" event={"ID":"d5f95d61-b7a4-4c81-b84c-f82513d75a4f","Type":"ContainerStarted","Data":"187947cc499fc6e0ddca83d504bca4abec31621a1d39b40de82c175d830a6a67"}
Oct 01 12:15:01 crc kubenswrapper[4669]: I1001 12:15:01.597803 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9k9qw" podUID="8d80260c-d8f7-4714-b843-cb711248fbdd" containerName="registry-server" containerID="cri-o://3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840" gracePeriod=2
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.047556 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.140521 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-catalog-content\") pod \"8d80260c-d8f7-4714-b843-cb711248fbdd\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") "
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.140899 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jghzq\" (UniqueName: \"kubernetes.io/projected/8d80260c-d8f7-4714-b843-cb711248fbdd-kube-api-access-jghzq\") pod \"8d80260c-d8f7-4714-b843-cb711248fbdd\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") "
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.141227 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-utilities\") pod \"8d80260c-d8f7-4714-b843-cb711248fbdd\" (UID: \"8d80260c-d8f7-4714-b843-cb711248fbdd\") "
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.142437 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-utilities" (OuterVolumeSpecName: "utilities") pod "8d80260c-d8f7-4714-b843-cb711248fbdd" (UID: "8d80260c-d8f7-4714-b843-cb711248fbdd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.151743 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d80260c-d8f7-4714-b843-cb711248fbdd-kube-api-access-jghzq" (OuterVolumeSpecName: "kube-api-access-jghzq") pod "8d80260c-d8f7-4714-b843-cb711248fbdd" (UID: "8d80260c-d8f7-4714-b843-cb711248fbdd"). InnerVolumeSpecName "kube-api-access-jghzq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.222868 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d80260c-d8f7-4714-b843-cb711248fbdd" (UID: "8d80260c-d8f7-4714-b843-cb711248fbdd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.244141 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.244191 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d80260c-d8f7-4714-b843-cb711248fbdd-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.244216 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jghzq\" (UniqueName: \"kubernetes.io/projected/8d80260c-d8f7-4714-b843-cb711248fbdd-kube-api-access-jghzq\") on node \"crc\" DevicePath \"\""
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.613589 4669 generic.go:334] "Generic (PLEG): container finished" podID="8d80260c-d8f7-4714-b843-cb711248fbdd" containerID="3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840" exitCode=0
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.613667 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k9qw" event={"ID":"8d80260c-d8f7-4714-b843-cb711248fbdd","Type":"ContainerDied","Data":"3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840"}
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.613728 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9k9qw"
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.613764 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k9qw" event={"ID":"8d80260c-d8f7-4714-b843-cb711248fbdd","Type":"ContainerDied","Data":"d57903b25cc4cea2ba888ab232737dc9f96234ba3c80d22c7af58673569407c0"}
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.613795 4669 scope.go:117] "RemoveContainer" containerID="3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840"
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.650963 4669 scope.go:117] "RemoveContainer" containerID="f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a"
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.662542 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9k9qw"]
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.671306 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9k9qw"]
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.687017 4669 scope.go:117] "RemoveContainer" containerID="37045234d942d2bc199e6686e166db966690421e1d96bbf900fc68a602ea4165"
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.770462 4669 scope.go:117] "RemoveContainer" containerID="3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840"
Oct 01 12:15:02 crc kubenswrapper[4669]: E1001 12:15:02.774264 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840\": container with ID starting with 3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840 not found: ID does not exist" containerID="3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840"
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.774355 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840"} err="failed to get container status \"3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840\": rpc error: code = NotFound desc = could not find container \"3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840\": container with ID starting with 3d4d08b0975cff057e1bc28964c326fa3cc073e92c6017e3976952ebfe866840 not found: ID does not exist"
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.774412 4669 scope.go:117] "RemoveContainer" containerID="f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a"
Oct 01 12:15:02 crc kubenswrapper[4669]: E1001 12:15:02.775336 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a\": container with ID starting with f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a not found: ID does not exist" containerID="f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a"
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.775392 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a"} err="failed to get container status \"f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a\": rpc error: code = NotFound desc = could not find container \"f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a\": container with ID starting with f3563fc6fbb5ed3a54f127ef0c34b5fc195310327cc29f07443f85d4301c2f0a not found: ID does not exist"
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.775426 4669 scope.go:117] "RemoveContainer" containerID="37045234d942d2bc199e6686e166db966690421e1d96bbf900fc68a602ea4165"
Oct 01 12:15:02 crc kubenswrapper[4669]: E1001 12:15:02.782482 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37045234d942d2bc199e6686e166db966690421e1d96bbf900fc68a602ea4165\": container with ID starting with 37045234d942d2bc199e6686e166db966690421e1d96bbf900fc68a602ea4165 not found: ID does not exist" containerID="37045234d942d2bc199e6686e166db966690421e1d96bbf900fc68a602ea4165"
Oct 01 12:15:02 crc kubenswrapper[4669]: I1001 12:15:02.782541 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37045234d942d2bc199e6686e166db966690421e1d96bbf900fc68a602ea4165"} err="failed to get container status \"37045234d942d2bc199e6686e166db966690421e1d96bbf900fc68a602ea4165\": rpc error: code = NotFound desc = could not find container \"37045234d942d2bc199e6686e166db966690421e1d96bbf900fc68a602ea4165\": container with ID starting with 37045234d942d2bc199e6686e166db966690421e1d96bbf900fc68a602ea4165 not found: ID does not exist"
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.052120 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7"
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.164861 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-secret-volume\") pod \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") "
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.165183 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq8sl\" (UniqueName: \"kubernetes.io/projected/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-kube-api-access-hq8sl\") pod \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") "
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.165355 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-config-volume\") pod \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\" (UID: \"d5f95d61-b7a4-4c81-b84c-f82513d75a4f\") "
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.165927 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-config-volume" (OuterVolumeSpecName: "config-volume") pod "d5f95d61-b7a4-4c81-b84c-f82513d75a4f" (UID: "d5f95d61-b7a4-4c81-b84c-f82513d75a4f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.166407 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-config-volume\") on node \"crc\" DevicePath \"\""
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.170995 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-kube-api-access-hq8sl" (OuterVolumeSpecName: "kube-api-access-hq8sl") pod "d5f95d61-b7a4-4c81-b84c-f82513d75a4f" (UID: "d5f95d61-b7a4-4c81-b84c-f82513d75a4f"). InnerVolumeSpecName "kube-api-access-hq8sl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.172218 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d5f95d61-b7a4-4c81-b84c-f82513d75a4f" (UID: "d5f95d61-b7a4-4c81-b84c-f82513d75a4f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.268676 4669 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.269110 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq8sl\" (UniqueName: \"kubernetes.io/projected/d5f95d61-b7a4-4c81-b84c-f82513d75a4f-kube-api-access-hq8sl\") on node \"crc\" DevicePath \"\""
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.627573 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7" event={"ID":"d5f95d61-b7a4-4c81-b84c-f82513d75a4f","Type":"ContainerDied","Data":"187947cc499fc6e0ddca83d504bca4abec31621a1d39b40de82c175d830a6a67"}
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.627626 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="187947cc499fc6e0ddca83d504bca4abec31621a1d39b40de82c175d830a6a67"
Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.627636 4669 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322015-tm9p7" Oct 01 12:15:03 crc kubenswrapper[4669]: I1001 12:15:03.661593 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d80260c-d8f7-4714-b843-cb711248fbdd" path="/var/lib/kubelet/pods/8d80260c-d8f7-4714-b843-cb711248fbdd/volumes" Oct 01 12:15:04 crc kubenswrapper[4669]: I1001 12:15:04.144217 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk"] Oct 01 12:15:04 crc kubenswrapper[4669]: I1001 12:15:04.155824 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321970-cpbkk"] Oct 01 12:15:05 crc kubenswrapper[4669]: I1001 12:15:05.657446 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ed84780-32e8-41fe-a20d-4c7a633ee541" path="/var/lib/kubelet/pods/0ed84780-32e8-41fe-a20d-4c7a633ee541/volumes" Oct 01 12:15:31 crc kubenswrapper[4669]: I1001 12:15:31.994815 4669 generic.go:334] "Generic (PLEG): container finished" podID="d1966594-3c43-4ecf-a982-fc851d0bb43b" containerID="9ae656aae8e3678a83c7ef80aa6769b7215f365c224185ec0f07229ed0d8cfa3" exitCode=0 Oct 01 12:15:31 crc kubenswrapper[4669]: I1001 12:15:31.994965 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" event={"ID":"d1966594-3c43-4ecf-a982-fc851d0bb43b","Type":"ContainerDied","Data":"9ae656aae8e3678a83c7ef80aa6769b7215f365c224185ec0f07229ed0d8cfa3"} Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.472729 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.610343 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk5th\" (UniqueName: \"kubernetes.io/projected/d1966594-3c43-4ecf-a982-fc851d0bb43b-kube-api-access-kk5th\") pod \"d1966594-3c43-4ecf-a982-fc851d0bb43b\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.610411 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-1\") pod \"d1966594-3c43-4ecf-a982-fc851d0bb43b\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.610538 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-telemetry-combined-ca-bundle\") pod \"d1966594-3c43-4ecf-a982-fc851d0bb43b\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.610567 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-inventory\") pod \"d1966594-3c43-4ecf-a982-fc851d0bb43b\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.610591 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-2\") pod \"d1966594-3c43-4ecf-a982-fc851d0bb43b\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " Oct 01 12:15:33 crc kubenswrapper[4669]: 
I1001 12:15:33.610626 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-0\") pod \"d1966594-3c43-4ecf-a982-fc851d0bb43b\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.610677 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ssh-key\") pod \"d1966594-3c43-4ecf-a982-fc851d0bb43b\" (UID: \"d1966594-3c43-4ecf-a982-fc851d0bb43b\") " Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.639099 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1966594-3c43-4ecf-a982-fc851d0bb43b-kube-api-access-kk5th" (OuterVolumeSpecName: "kube-api-access-kk5th") pod "d1966594-3c43-4ecf-a982-fc851d0bb43b" (UID: "d1966594-3c43-4ecf-a982-fc851d0bb43b"). InnerVolumeSpecName "kube-api-access-kk5th". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.640202 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d1966594-3c43-4ecf-a982-fc851d0bb43b" (UID: "d1966594-3c43-4ecf-a982-fc851d0bb43b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.643661 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-inventory" (OuterVolumeSpecName: "inventory") pod "d1966594-3c43-4ecf-a982-fc851d0bb43b" (UID: "d1966594-3c43-4ecf-a982-fc851d0bb43b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.646197 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d1966594-3c43-4ecf-a982-fc851d0bb43b" (UID: "d1966594-3c43-4ecf-a982-fc851d0bb43b"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.652517 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d1966594-3c43-4ecf-a982-fc851d0bb43b" (UID: "d1966594-3c43-4ecf-a982-fc851d0bb43b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.663011 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d1966594-3c43-4ecf-a982-fc851d0bb43b" (UID: "d1966594-3c43-4ecf-a982-fc851d0bb43b"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.663065 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d1966594-3c43-4ecf-a982-fc851d0bb43b" (UID: "d1966594-3c43-4ecf-a982-fc851d0bb43b"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.714139 4669 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.714197 4669 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.714218 4669 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.714234 4669 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.714254 4669 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.714271 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1966594-3c43-4ecf-a982-fc851d0bb43b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:15:33 crc kubenswrapper[4669]: I1001 12:15:33.714289 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk5th\" (UniqueName: 
\"kubernetes.io/projected/d1966594-3c43-4ecf-a982-fc851d0bb43b-kube-api-access-kk5th\") on node \"crc\" DevicePath \"\"" Oct 01 12:15:34 crc kubenswrapper[4669]: I1001 12:15:34.019014 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" event={"ID":"d1966594-3c43-4ecf-a982-fc851d0bb43b","Type":"ContainerDied","Data":"26d8af4a4450f51ac2ad24aed15f80f9e49f6c3ca7440089441e29400febea68"} Oct 01 12:15:34 crc kubenswrapper[4669]: I1001 12:15:34.019116 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26d8af4a4450f51ac2ad24aed15f80f9e49f6c3ca7440089441e29400febea68" Oct 01 12:15:34 crc kubenswrapper[4669]: I1001 12:15:34.019210 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl" Oct 01 12:15:36 crc kubenswrapper[4669]: I1001 12:15:36.307494 4669 scope.go:117] "RemoveContainer" containerID="b2ef4c3a87df0e2cf56a0fd6b2e309deb960a8c375c556503bb0dbc4459a544d" Oct 01 12:16:01 crc kubenswrapper[4669]: I1001 12:16:01.863309 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:16:01 crc kubenswrapper[4669]: I1001 12:16:01.864093 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.761781 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6g6s8"] Oct 01 12:16:08 
crc kubenswrapper[4669]: E1001 12:16:08.763131 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1966594-3c43-4ecf-a982-fc851d0bb43b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.763149 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1966594-3c43-4ecf-a982-fc851d0bb43b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 12:16:08 crc kubenswrapper[4669]: E1001 12:16:08.763169 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d80260c-d8f7-4714-b843-cb711248fbdd" containerName="registry-server" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.763178 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d80260c-d8f7-4714-b843-cb711248fbdd" containerName="registry-server" Oct 01 12:16:08 crc kubenswrapper[4669]: E1001 12:16:08.763205 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d80260c-d8f7-4714-b843-cb711248fbdd" containerName="extract-utilities" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.763216 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d80260c-d8f7-4714-b843-cb711248fbdd" containerName="extract-utilities" Oct 01 12:16:08 crc kubenswrapper[4669]: E1001 12:16:08.763231 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f95d61-b7a4-4c81-b84c-f82513d75a4f" containerName="collect-profiles" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.763239 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f95d61-b7a4-4c81-b84c-f82513d75a4f" containerName="collect-profiles" Oct 01 12:16:08 crc kubenswrapper[4669]: E1001 12:16:08.763261 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d80260c-d8f7-4714-b843-cb711248fbdd" containerName="extract-content" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.763270 4669 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8d80260c-d8f7-4714-b843-cb711248fbdd" containerName="extract-content" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.763502 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d80260c-d8f7-4714-b843-cb711248fbdd" containerName="registry-server" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.763520 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f95d61-b7a4-4c81-b84c-f82513d75a4f" containerName="collect-profiles" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.763533 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1966594-3c43-4ecf-a982-fc851d0bb43b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.765409 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.773518 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6g6s8"] Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.906194 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-catalog-content\") pod \"community-operators-6g6s8\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.906251 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-utilities\") pod \"community-operators-6g6s8\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:08 crc kubenswrapper[4669]: I1001 12:16:08.906334 4669 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hwtn\" (UniqueName: \"kubernetes.io/projected/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-kube-api-access-6hwtn\") pod \"community-operators-6g6s8\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:09 crc kubenswrapper[4669]: I1001 12:16:09.008616 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwtn\" (UniqueName: \"kubernetes.io/projected/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-kube-api-access-6hwtn\") pod \"community-operators-6g6s8\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:09 crc kubenswrapper[4669]: I1001 12:16:09.009205 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-catalog-content\") pod \"community-operators-6g6s8\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:09 crc kubenswrapper[4669]: I1001 12:16:09.009229 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-utilities\") pod \"community-operators-6g6s8\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:09 crc kubenswrapper[4669]: I1001 12:16:09.009739 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-utilities\") pod \"community-operators-6g6s8\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:09 crc kubenswrapper[4669]: I1001 12:16:09.010456 4669 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-catalog-content\") pod \"community-operators-6g6s8\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:09 crc kubenswrapper[4669]: I1001 12:16:09.045455 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwtn\" (UniqueName: \"kubernetes.io/projected/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-kube-api-access-6hwtn\") pod \"community-operators-6g6s8\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:09 crc kubenswrapper[4669]: I1001 12:16:09.129767 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:09 crc kubenswrapper[4669]: I1001 12:16:09.708692 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6g6s8"] Oct 01 12:16:09 crc kubenswrapper[4669]: W1001 12:16:09.714738 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a2f96fd_3d90_4db7_8726_8c0d14fd4de0.slice/crio-771802ffc0144e5096a1e29c69b2bc3bf6a9a316a7127db217ee8d6c4c69b0b4 WatchSource:0}: Error finding container 771802ffc0144e5096a1e29c69b2bc3bf6a9a316a7127db217ee8d6c4c69b0b4: Status 404 returned error can't find the container with id 771802ffc0144e5096a1e29c69b2bc3bf6a9a316a7127db217ee8d6c4c69b0b4 Oct 01 12:16:10 crc kubenswrapper[4669]: I1001 12:16:10.418457 4669 generic.go:334] "Generic (PLEG): container finished" podID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" containerID="40ea9fa62a4a576a75452485d7d0f6fda07f2fd3aaa1f59a8cd4ab2a73d88e21" exitCode=0 Oct 01 12:16:10 crc kubenswrapper[4669]: I1001 12:16:10.418561 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-6g6s8" event={"ID":"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0","Type":"ContainerDied","Data":"40ea9fa62a4a576a75452485d7d0f6fda07f2fd3aaa1f59a8cd4ab2a73d88e21"} Oct 01 12:16:10 crc kubenswrapper[4669]: I1001 12:16:10.419028 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g6s8" event={"ID":"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0","Type":"ContainerStarted","Data":"771802ffc0144e5096a1e29c69b2bc3bf6a9a316a7127db217ee8d6c4c69b0b4"} Oct 01 12:16:12 crc kubenswrapper[4669]: I1001 12:16:12.459414 4669 generic.go:334] "Generic (PLEG): container finished" podID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" containerID="5a6ec614b55ffcf8383f6a0f29e61ac0f5d0d79837ed9206dadd58893b9861d0" exitCode=0 Oct 01 12:16:12 crc kubenswrapper[4669]: I1001 12:16:12.459569 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g6s8" event={"ID":"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0","Type":"ContainerDied","Data":"5a6ec614b55ffcf8383f6a0f29e61ac0f5d0d79837ed9206dadd58893b9861d0"} Oct 01 12:16:13 crc kubenswrapper[4669]: I1001 12:16:13.472321 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g6s8" event={"ID":"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0","Type":"ContainerStarted","Data":"779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e"} Oct 01 12:16:13 crc kubenswrapper[4669]: I1001 12:16:13.496248 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6g6s8" podStartSLOduration=3.048017149 podStartE2EDuration="5.49622558s" podCreationTimestamp="2025-10-01 12:16:08 +0000 UTC" firstStartedPulling="2025-10-01 12:16:10.421673585 +0000 UTC m=+2861.521238562" lastFinishedPulling="2025-10-01 12:16:12.869882016 +0000 UTC m=+2863.969446993" observedRunningTime="2025-10-01 12:16:13.491852942 +0000 UTC m=+2864.591417919" 
watchObservedRunningTime="2025-10-01 12:16:13.49622558 +0000 UTC m=+2864.595790577" Oct 01 12:16:19 crc kubenswrapper[4669]: I1001 12:16:19.130351 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:19 crc kubenswrapper[4669]: I1001 12:16:19.131224 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:19 crc kubenswrapper[4669]: I1001 12:16:19.197403 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:19 crc kubenswrapper[4669]: I1001 12:16:19.609375 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:19 crc kubenswrapper[4669]: I1001 12:16:19.686902 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6g6s8"] Oct 01 12:16:21 crc kubenswrapper[4669]: I1001 12:16:21.555809 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6g6s8" podUID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" containerName="registry-server" containerID="cri-o://779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e" gracePeriod=2 Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.039844 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.156918 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hwtn\" (UniqueName: \"kubernetes.io/projected/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-kube-api-access-6hwtn\") pod \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.157102 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-utilities\") pod \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.157159 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-catalog-content\") pod \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\" (UID: \"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0\") " Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.159131 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-utilities" (OuterVolumeSpecName: "utilities") pod "9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" (UID: "9a2f96fd-3d90-4db7-8726-8c0d14fd4de0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.166806 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-kube-api-access-6hwtn" (OuterVolumeSpecName: "kube-api-access-6hwtn") pod "9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" (UID: "9a2f96fd-3d90-4db7-8726-8c0d14fd4de0"). InnerVolumeSpecName "kube-api-access-6hwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.222422 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" (UID: "9a2f96fd-3d90-4db7-8726-8c0d14fd4de0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.259849 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.260230 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.260251 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hwtn\" (UniqueName: \"kubernetes.io/projected/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0-kube-api-access-6hwtn\") on node \"crc\" DevicePath \"\"" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.569576 4669 generic.go:334] "Generic (PLEG): container finished" podID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" containerID="779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e" exitCode=0 Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.569634 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g6s8" event={"ID":"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0","Type":"ContainerDied","Data":"779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e"} Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.569669 4669 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-6g6s8" event={"ID":"9a2f96fd-3d90-4db7-8726-8c0d14fd4de0","Type":"ContainerDied","Data":"771802ffc0144e5096a1e29c69b2bc3bf6a9a316a7127db217ee8d6c4c69b0b4"} Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.569691 4669 scope.go:117] "RemoveContainer" containerID="779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.569717 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6g6s8" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.615893 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6g6s8"] Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.616959 4669 scope.go:117] "RemoveContainer" containerID="5a6ec614b55ffcf8383f6a0f29e61ac0f5d0d79837ed9206dadd58893b9861d0" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.626773 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6g6s8"] Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.645112 4669 scope.go:117] "RemoveContainer" containerID="40ea9fa62a4a576a75452485d7d0f6fda07f2fd3aaa1f59a8cd4ab2a73d88e21" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.700695 4669 scope.go:117] "RemoveContainer" containerID="779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e" Oct 01 12:16:22 crc kubenswrapper[4669]: E1001 12:16:22.701285 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e\": container with ID starting with 779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e not found: ID does not exist" containerID="779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 
12:16:22.701369 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e"} err="failed to get container status \"779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e\": rpc error: code = NotFound desc = could not find container \"779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e\": container with ID starting with 779b6a9b78569d6bebf198f31c7da1c336319a1bc2ee4b11e232fee63a99888e not found: ID does not exist" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.701415 4669 scope.go:117] "RemoveContainer" containerID="5a6ec614b55ffcf8383f6a0f29e61ac0f5d0d79837ed9206dadd58893b9861d0" Oct 01 12:16:22 crc kubenswrapper[4669]: E1001 12:16:22.701912 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a6ec614b55ffcf8383f6a0f29e61ac0f5d0d79837ed9206dadd58893b9861d0\": container with ID starting with 5a6ec614b55ffcf8383f6a0f29e61ac0f5d0d79837ed9206dadd58893b9861d0 not found: ID does not exist" containerID="5a6ec614b55ffcf8383f6a0f29e61ac0f5d0d79837ed9206dadd58893b9861d0" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.701968 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a6ec614b55ffcf8383f6a0f29e61ac0f5d0d79837ed9206dadd58893b9861d0"} err="failed to get container status \"5a6ec614b55ffcf8383f6a0f29e61ac0f5d0d79837ed9206dadd58893b9861d0\": rpc error: code = NotFound desc = could not find container \"5a6ec614b55ffcf8383f6a0f29e61ac0f5d0d79837ed9206dadd58893b9861d0\": container with ID starting with 5a6ec614b55ffcf8383f6a0f29e61ac0f5d0d79837ed9206dadd58893b9861d0 not found: ID does not exist" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.702005 4669 scope.go:117] "RemoveContainer" containerID="40ea9fa62a4a576a75452485d7d0f6fda07f2fd3aaa1f59a8cd4ab2a73d88e21" Oct 01 12:16:22 crc 
kubenswrapper[4669]: E1001 12:16:22.702539 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40ea9fa62a4a576a75452485d7d0f6fda07f2fd3aaa1f59a8cd4ab2a73d88e21\": container with ID starting with 40ea9fa62a4a576a75452485d7d0f6fda07f2fd3aaa1f59a8cd4ab2a73d88e21 not found: ID does not exist" containerID="40ea9fa62a4a576a75452485d7d0f6fda07f2fd3aaa1f59a8cd4ab2a73d88e21" Oct 01 12:16:22 crc kubenswrapper[4669]: I1001 12:16:22.702574 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ea9fa62a4a576a75452485d7d0f6fda07f2fd3aaa1f59a8cd4ab2a73d88e21"} err="failed to get container status \"40ea9fa62a4a576a75452485d7d0f6fda07f2fd3aaa1f59a8cd4ab2a73d88e21\": rpc error: code = NotFound desc = could not find container \"40ea9fa62a4a576a75452485d7d0f6fda07f2fd3aaa1f59a8cd4ab2a73d88e21\": container with ID starting with 40ea9fa62a4a576a75452485d7d0f6fda07f2fd3aaa1f59a8cd4ab2a73d88e21 not found: ID does not exist" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.085882 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 12:16:23 crc kubenswrapper[4669]: E1001 12:16:23.086912 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" containerName="registry-server" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.086952 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" containerName="registry-server" Oct 01 12:16:23 crc kubenswrapper[4669]: E1001 12:16:23.086977 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" containerName="extract-utilities" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.086994 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" containerName="extract-utilities" Oct 01 
12:16:23 crc kubenswrapper[4669]: E1001 12:16:23.087047 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" containerName="extract-content" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.087060 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" containerName="extract-content" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.087443 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" containerName="registry-server" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.088748 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.091757 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.091962 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6zxgp" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.092261 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.097228 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.126181 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.181837 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: 
\"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.182036 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.182359 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.182421 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.182492 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-config-data\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.182649 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.182718 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phzg9\" (UniqueName: \"kubernetes.io/projected/fce73f67-b429-4b4a-b873-a45f92d104c7-kube-api-access-phzg9\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.182914 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.183108 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.285576 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.285726 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " 
pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.285829 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.286042 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.286575 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.286653 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.287108 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 
01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.287182 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-config-data\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.287262 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.287313 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phzg9\" (UniqueName: \"kubernetes.io/projected/fce73f67-b429-4b4a-b873-a45f92d104c7-kube-api-access-phzg9\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.287558 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.287720 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.288656 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.291310 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-config-data\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.295446 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.295966 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.296830 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.324865 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phzg9\" (UniqueName: \"kubernetes.io/projected/fce73f67-b429-4b4a-b873-a45f92d104c7-kube-api-access-phzg9\") pod \"tempest-tests-tempest\" (UID: 
\"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.327342 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.425429 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.656741 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a2f96fd-3d90-4db7-8726-8c0d14fd4de0" path="/var/lib/kubelet/pods/9a2f96fd-3d90-4db7-8726-8c0d14fd4de0/volumes" Oct 01 12:16:23 crc kubenswrapper[4669]: I1001 12:16:23.780298 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 12:16:24 crc kubenswrapper[4669]: I1001 12:16:24.614647 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fce73f67-b429-4b4a-b873-a45f92d104c7","Type":"ContainerStarted","Data":"85ceade58cb0c4740064a6b19afc7ce29bdeebb3747ae5e42f2e634a68ad6a73"} Oct 01 12:16:28 crc kubenswrapper[4669]: I1001 12:16:28.941794 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9pp96"] Oct 01 12:16:28 crc kubenswrapper[4669]: I1001 12:16:28.944740 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:16:28 crc kubenswrapper[4669]: I1001 12:16:28.962248 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9pp96"] Oct 01 12:16:29 crc kubenswrapper[4669]: I1001 12:16:29.026134 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-catalog-content\") pod \"redhat-operators-9pp96\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:16:29 crc kubenswrapper[4669]: I1001 12:16:29.026535 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-287hz\" (UniqueName: \"kubernetes.io/projected/8223ffe0-afa2-4e11-836e-5361c5d265dd-kube-api-access-287hz\") pod \"redhat-operators-9pp96\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:16:29 crc kubenswrapper[4669]: I1001 12:16:29.026618 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-utilities\") pod \"redhat-operators-9pp96\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:16:29 crc kubenswrapper[4669]: I1001 12:16:29.129279 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-catalog-content\") pod \"redhat-operators-9pp96\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:16:29 crc kubenswrapper[4669]: I1001 12:16:29.129768 4669 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-287hz\" (UniqueName: \"kubernetes.io/projected/8223ffe0-afa2-4e11-836e-5361c5d265dd-kube-api-access-287hz\") pod \"redhat-operators-9pp96\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:16:29 crc kubenswrapper[4669]: I1001 12:16:29.129802 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-utilities\") pod \"redhat-operators-9pp96\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:16:29 crc kubenswrapper[4669]: I1001 12:16:29.129818 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-catalog-content\") pod \"redhat-operators-9pp96\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:16:29 crc kubenswrapper[4669]: I1001 12:16:29.130274 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-utilities\") pod \"redhat-operators-9pp96\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:16:29 crc kubenswrapper[4669]: I1001 12:16:29.169034 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-287hz\" (UniqueName: \"kubernetes.io/projected/8223ffe0-afa2-4e11-836e-5361c5d265dd-kube-api-access-287hz\") pod \"redhat-operators-9pp96\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:16:29 crc kubenswrapper[4669]: I1001 12:16:29.291022 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:16:29 crc kubenswrapper[4669]: I1001 12:16:29.821131 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9pp96"] Oct 01 12:16:30 crc kubenswrapper[4669]: I1001 12:16:30.689449 4669 generic.go:334] "Generic (PLEG): container finished" podID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerID="70e4254d68b3b2a1373107595f3ea617c665e80fe6a89550bdc2dad37233d7b9" exitCode=0 Oct 01 12:16:30 crc kubenswrapper[4669]: I1001 12:16:30.689551 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pp96" event={"ID":"8223ffe0-afa2-4e11-836e-5361c5d265dd","Type":"ContainerDied","Data":"70e4254d68b3b2a1373107595f3ea617c665e80fe6a89550bdc2dad37233d7b9"} Oct 01 12:16:30 crc kubenswrapper[4669]: I1001 12:16:30.689843 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pp96" event={"ID":"8223ffe0-afa2-4e11-836e-5361c5d265dd","Type":"ContainerStarted","Data":"b1485dcf5795b57a70c656845669dba4327cb5e3ae3c3042eb4b8a89defaee6e"} Oct 01 12:16:31 crc kubenswrapper[4669]: I1001 12:16:31.863646 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:16:31 crc kubenswrapper[4669]: I1001 12:16:31.864088 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:17:01 crc kubenswrapper[4669]: I1001 12:17:01.863194 4669 patch_prober.go:28] interesting 
pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:17:01 crc kubenswrapper[4669]: I1001 12:17:01.864020 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:17:01 crc kubenswrapper[4669]: I1001 12:17:01.864110 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 12:17:01 crc kubenswrapper[4669]: I1001 12:17:01.865399 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9de938d4db4a6b2603c7c88233b8827064d7ec05a2d9ca7acc89aebe3d5259a4"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:17:01 crc kubenswrapper[4669]: I1001 12:17:01.865465 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://9de938d4db4a6b2603c7c88233b8827064d7ec05a2d9ca7acc89aebe3d5259a4" gracePeriod=600 Oct 01 12:17:02 crc kubenswrapper[4669]: I1001 12:17:02.045006 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="9de938d4db4a6b2603c7c88233b8827064d7ec05a2d9ca7acc89aebe3d5259a4" exitCode=0 Oct 01 12:17:02 crc kubenswrapper[4669]: I1001 12:17:02.045066 
4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"9de938d4db4a6b2603c7c88233b8827064d7ec05a2d9ca7acc89aebe3d5259a4"} Oct 01 12:17:02 crc kubenswrapper[4669]: I1001 12:17:02.045145 4669 scope.go:117] "RemoveContainer" containerID="38e4f076ed5971c4a2cba9a7747fea4abc621c3e6a4533e3ac5e19d82e8855bc" Oct 01 12:17:04 crc kubenswrapper[4669]: E1001 12:17:04.662199 4669 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 01 12:17:04 crc kubenswrapper[4669]: E1001 12:17:04.663315 4669 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,
MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-phzg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(fce73f67-b429-4b4a-b873-a45f92d104c7): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Oct 01 12:17:04 crc kubenswrapper[4669]: E1001 12:17:04.664511 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="fce73f67-b429-4b4a-b873-a45f92d104c7" Oct 01 12:17:05 crc kubenswrapper[4669]: I1001 12:17:05.083979 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f"} Oct 01 12:17:05 crc kubenswrapper[4669]: I1001 12:17:05.086475 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pp96" event={"ID":"8223ffe0-afa2-4e11-836e-5361c5d265dd","Type":"ContainerStarted","Data":"01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d"} Oct 01 12:17:05 crc kubenswrapper[4669]: E1001 12:17:05.088191 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="fce73f67-b429-4b4a-b873-a45f92d104c7" Oct 01 12:17:06 crc kubenswrapper[4669]: I1001 12:17:06.107107 4669 generic.go:334] "Generic (PLEG): container finished" podID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerID="01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d" exitCode=0 Oct 01 12:17:06 crc kubenswrapper[4669]: I1001 12:17:06.107208 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pp96" 
event={"ID":"8223ffe0-afa2-4e11-836e-5361c5d265dd","Type":"ContainerDied","Data":"01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d"} Oct 01 12:17:09 crc kubenswrapper[4669]: I1001 12:17:09.156272 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pp96" event={"ID":"8223ffe0-afa2-4e11-836e-5361c5d265dd","Type":"ContainerStarted","Data":"1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be"} Oct 01 12:17:09 crc kubenswrapper[4669]: I1001 12:17:09.185651 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9pp96" podStartSLOduration=4.621416032 podStartE2EDuration="41.185628394s" podCreationTimestamp="2025-10-01 12:16:28 +0000 UTC" firstStartedPulling="2025-10-01 12:16:30.691317361 +0000 UTC m=+2881.790882348" lastFinishedPulling="2025-10-01 12:17:07.255529723 +0000 UTC m=+2918.355094710" observedRunningTime="2025-10-01 12:17:09.180961019 +0000 UTC m=+2920.280525996" watchObservedRunningTime="2025-10-01 12:17:09.185628394 +0000 UTC m=+2920.285193371" Oct 01 12:17:09 crc kubenswrapper[4669]: I1001 12:17:09.291654 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:17:09 crc kubenswrapper[4669]: I1001 12:17:09.291737 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:17:10 crc kubenswrapper[4669]: I1001 12:17:10.342057 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9pp96" podUID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerName="registry-server" probeResult="failure" output=< Oct 01 12:17:10 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 01 12:17:10 crc kubenswrapper[4669]: > Oct 01 12:17:17 crc kubenswrapper[4669]: I1001 12:17:17.280587 4669 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 01 12:17:19 crc kubenswrapper[4669]: I1001 12:17:19.274985 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fce73f67-b429-4b4a-b873-a45f92d104c7","Type":"ContainerStarted","Data":"b1a063941d16fd79a319621e61e3e40f314c439f5644ea6bfbcb2339f332596f"} Oct 01 12:17:19 crc kubenswrapper[4669]: I1001 12:17:19.297799 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.810677327 podStartE2EDuration="57.297770158s" podCreationTimestamp="2025-10-01 12:16:22 +0000 UTC" firstStartedPulling="2025-10-01 12:16:23.790572051 +0000 UTC m=+2874.890137028" lastFinishedPulling="2025-10-01 12:17:17.277664882 +0000 UTC m=+2928.377229859" observedRunningTime="2025-10-01 12:17:19.290737015 +0000 UTC m=+2930.390302002" watchObservedRunningTime="2025-10-01 12:17:19.297770158 +0000 UTC m=+2930.397335135" Oct 01 12:17:19 crc kubenswrapper[4669]: I1001 12:17:19.358785 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:17:19 crc kubenswrapper[4669]: I1001 12:17:19.438839 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:17:19 crc kubenswrapper[4669]: I1001 12:17:19.606629 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9pp96"] Oct 01 12:17:21 crc kubenswrapper[4669]: I1001 12:17:21.301047 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9pp96" podUID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerName="registry-server" containerID="cri-o://1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be" gracePeriod=2 Oct 01 12:17:21 crc kubenswrapper[4669]: I1001 12:17:21.816957 4669 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:17:21 crc kubenswrapper[4669]: I1001 12:17:21.881836 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-catalog-content\") pod \"8223ffe0-afa2-4e11-836e-5361c5d265dd\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " Oct 01 12:17:21 crc kubenswrapper[4669]: I1001 12:17:21.882071 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-287hz\" (UniqueName: \"kubernetes.io/projected/8223ffe0-afa2-4e11-836e-5361c5d265dd-kube-api-access-287hz\") pod \"8223ffe0-afa2-4e11-836e-5361c5d265dd\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " Oct 01 12:17:21 crc kubenswrapper[4669]: I1001 12:17:21.882195 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-utilities\") pod \"8223ffe0-afa2-4e11-836e-5361c5d265dd\" (UID: \"8223ffe0-afa2-4e11-836e-5361c5d265dd\") " Oct 01 12:17:21 crc kubenswrapper[4669]: I1001 12:17:21.883853 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-utilities" (OuterVolumeSpecName: "utilities") pod "8223ffe0-afa2-4e11-836e-5361c5d265dd" (UID: "8223ffe0-afa2-4e11-836e-5361c5d265dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:17:21 crc kubenswrapper[4669]: I1001 12:17:21.890096 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8223ffe0-afa2-4e11-836e-5361c5d265dd-kube-api-access-287hz" (OuterVolumeSpecName: "kube-api-access-287hz") pod "8223ffe0-afa2-4e11-836e-5361c5d265dd" (UID: "8223ffe0-afa2-4e11-836e-5361c5d265dd"). 
InnerVolumeSpecName "kube-api-access-287hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:17:21 crc kubenswrapper[4669]: I1001 12:17:21.984440 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-287hz\" (UniqueName: \"kubernetes.io/projected/8223ffe0-afa2-4e11-836e-5361c5d265dd-kube-api-access-287hz\") on node \"crc\" DevicePath \"\"" Oct 01 12:17:21 crc kubenswrapper[4669]: I1001 12:17:21.984489 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:17:21 crc kubenswrapper[4669]: I1001 12:17:21.985650 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8223ffe0-afa2-4e11-836e-5361c5d265dd" (UID: "8223ffe0-afa2-4e11-836e-5361c5d265dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.086449 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8223ffe0-afa2-4e11-836e-5361c5d265dd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.319540 4669 generic.go:334] "Generic (PLEG): container finished" podID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerID="1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be" exitCode=0 Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.319608 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pp96" event={"ID":"8223ffe0-afa2-4e11-836e-5361c5d265dd","Type":"ContainerDied","Data":"1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be"} Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.319654 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9pp96" event={"ID":"8223ffe0-afa2-4e11-836e-5361c5d265dd","Type":"ContainerDied","Data":"b1485dcf5795b57a70c656845669dba4327cb5e3ae3c3042eb4b8a89defaee6e"} Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.319687 4669 scope.go:117] "RemoveContainer" containerID="1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be" Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.319913 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9pp96" Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.369459 4669 scope.go:117] "RemoveContainer" containerID="01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d" Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.371992 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9pp96"] Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.384148 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9pp96"] Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.402596 4669 scope.go:117] "RemoveContainer" containerID="70e4254d68b3b2a1373107595f3ea617c665e80fe6a89550bdc2dad37233d7b9" Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.465251 4669 scope.go:117] "RemoveContainer" containerID="1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be" Oct 01 12:17:22 crc kubenswrapper[4669]: E1001 12:17:22.465754 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be\": container with ID starting with 1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be not found: ID does not exist" containerID="1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be" Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.465793 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be"} err="failed to get container status \"1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be\": rpc error: code = NotFound desc = could not find container \"1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be\": container with ID starting with 1e7ab8abc02cbab38674f77c67cde5570152b188021bb1944b9b6af18176d9be not found: ID does 
not exist" Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.465817 4669 scope.go:117] "RemoveContainer" containerID="01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d" Oct 01 12:17:22 crc kubenswrapper[4669]: E1001 12:17:22.466128 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d\": container with ID starting with 01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d not found: ID does not exist" containerID="01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d" Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.466160 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d"} err="failed to get container status \"01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d\": rpc error: code = NotFound desc = could not find container \"01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d\": container with ID starting with 01f3b6c6fb6b95ee34caa724e83979090cf7645392e0b945da841e12c051d88d not found: ID does not exist" Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.466264 4669 scope.go:117] "RemoveContainer" containerID="70e4254d68b3b2a1373107595f3ea617c665e80fe6a89550bdc2dad37233d7b9" Oct 01 12:17:22 crc kubenswrapper[4669]: E1001 12:17:22.466692 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e4254d68b3b2a1373107595f3ea617c665e80fe6a89550bdc2dad37233d7b9\": container with ID starting with 70e4254d68b3b2a1373107595f3ea617c665e80fe6a89550bdc2dad37233d7b9 not found: ID does not exist" containerID="70e4254d68b3b2a1373107595f3ea617c665e80fe6a89550bdc2dad37233d7b9" Oct 01 12:17:22 crc kubenswrapper[4669]: I1001 12:17:22.466720 4669 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e4254d68b3b2a1373107595f3ea617c665e80fe6a89550bdc2dad37233d7b9"} err="failed to get container status \"70e4254d68b3b2a1373107595f3ea617c665e80fe6a89550bdc2dad37233d7b9\": rpc error: code = NotFound desc = could not find container \"70e4254d68b3b2a1373107595f3ea617c665e80fe6a89550bdc2dad37233d7b9\": container with ID starting with 70e4254d68b3b2a1373107595f3ea617c665e80fe6a89550bdc2dad37233d7b9 not found: ID does not exist" Oct 01 12:17:23 crc kubenswrapper[4669]: I1001 12:17:23.664884 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8223ffe0-afa2-4e11-836e-5361c5d265dd" path="/var/lib/kubelet/pods/8223ffe0-afa2-4e11-836e-5361c5d265dd/volumes" Oct 01 12:19:31 crc kubenswrapper[4669]: I1001 12:19:31.863631 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:19:31 crc kubenswrapper[4669]: I1001 12:19:31.865283 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:20:01 crc kubenswrapper[4669]: I1001 12:20:01.863470 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:20:01 crc kubenswrapper[4669]: I1001 12:20:01.864418 4669 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:20:31 crc kubenswrapper[4669]: I1001 12:20:31.866837 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:20:31 crc kubenswrapper[4669]: I1001 12:20:31.868911 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:20:31 crc kubenswrapper[4669]: I1001 12:20:31.869099 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 12:20:31 crc kubenswrapper[4669]: I1001 12:20:31.870189 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:20:31 crc kubenswrapper[4669]: I1001 12:20:31.870374 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" 
containerID="cri-o://017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" gracePeriod=600 Oct 01 12:20:32 crc kubenswrapper[4669]: E1001 12:20:32.007137 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:20:32 crc kubenswrapper[4669]: I1001 12:20:32.507544 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" exitCode=0 Oct 01 12:20:32 crc kubenswrapper[4669]: I1001 12:20:32.507600 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f"} Oct 01 12:20:32 crc kubenswrapper[4669]: I1001 12:20:32.507645 4669 scope.go:117] "RemoveContainer" containerID="9de938d4db4a6b2603c7c88233b8827064d7ec05a2d9ca7acc89aebe3d5259a4" Oct 01 12:20:32 crc kubenswrapper[4669]: I1001 12:20:32.508368 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:20:32 crc kubenswrapper[4669]: E1001 12:20:32.508628 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" 
podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:20:45 crc kubenswrapper[4669]: I1001 12:20:45.644958 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:20:45 crc kubenswrapper[4669]: E1001 12:20:45.646102 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:21:00 crc kubenswrapper[4669]: I1001 12:21:00.644683 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:21:00 crc kubenswrapper[4669]: E1001 12:21:00.645492 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:21:14 crc kubenswrapper[4669]: I1001 12:21:14.644018 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:21:14 crc kubenswrapper[4669]: E1001 12:21:14.644889 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:21:27 crc kubenswrapper[4669]: I1001 12:21:27.644704 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:21:27 crc kubenswrapper[4669]: E1001 12:21:27.645598 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:21:40 crc kubenswrapper[4669]: I1001 12:21:40.644643 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:21:40 crc kubenswrapper[4669]: E1001 12:21:40.645885 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:21:44 crc kubenswrapper[4669]: I1001 12:21:44.161286 4669 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6c769b8b9-5svbp" podUID="fd677364-3064-4b42-9555-b640561fa4ed" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 01 12:21:55 crc kubenswrapper[4669]: I1001 12:21:55.644306 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:21:55 crc kubenswrapper[4669]: E1001 
12:21:55.645281 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:22:07 crc kubenswrapper[4669]: I1001 12:22:07.645034 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:22:07 crc kubenswrapper[4669]: E1001 12:22:07.646079 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:22:21 crc kubenswrapper[4669]: I1001 12:22:21.645481 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:22:21 crc kubenswrapper[4669]: E1001 12:22:21.646628 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:22:36 crc kubenswrapper[4669]: I1001 12:22:36.644839 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:22:36 crc 
kubenswrapper[4669]: E1001 12:22:36.647116 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:22:51 crc kubenswrapper[4669]: I1001 12:22:51.644818 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:22:51 crc kubenswrapper[4669]: E1001 12:22:51.646887 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:23:02 crc kubenswrapper[4669]: I1001 12:23:02.645765 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:23:02 crc kubenswrapper[4669]: E1001 12:23:02.647572 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:23:13 crc kubenswrapper[4669]: I1001 12:23:13.645423 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 
01 12:23:13 crc kubenswrapper[4669]: E1001 12:23:13.648040 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:23:26 crc kubenswrapper[4669]: I1001 12:23:26.644755 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:23:26 crc kubenswrapper[4669]: E1001 12:23:26.646415 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:23:37 crc kubenswrapper[4669]: I1001 12:23:37.645293 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:23:37 crc kubenswrapper[4669]: E1001 12:23:37.646360 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:23:50 crc kubenswrapper[4669]: I1001 12:23:50.645376 4669 scope.go:117] "RemoveContainer" 
containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:23:50 crc kubenswrapper[4669]: E1001 12:23:50.646418 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:24:04 crc kubenswrapper[4669]: I1001 12:24:04.644531 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:24:04 crc kubenswrapper[4669]: E1001 12:24:04.645776 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:24:16 crc kubenswrapper[4669]: I1001 12:24:16.645222 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:24:16 crc kubenswrapper[4669]: E1001 12:24:16.647309 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:24:23 crc kubenswrapper[4669]: I1001 12:24:23.919989 4669 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vw6dh"] Oct 01 12:24:23 crc kubenswrapper[4669]: E1001 12:24:23.921289 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerName="extract-content" Oct 01 12:24:23 crc kubenswrapper[4669]: I1001 12:24:23.921306 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerName="extract-content" Oct 01 12:24:23 crc kubenswrapper[4669]: E1001 12:24:23.921323 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerName="extract-utilities" Oct 01 12:24:23 crc kubenswrapper[4669]: I1001 12:24:23.921330 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerName="extract-utilities" Oct 01 12:24:23 crc kubenswrapper[4669]: E1001 12:24:23.921340 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerName="registry-server" Oct 01 12:24:23 crc kubenswrapper[4669]: I1001 12:24:23.921350 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerName="registry-server" Oct 01 12:24:23 crc kubenswrapper[4669]: I1001 12:24:23.921615 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8223ffe0-afa2-4e11-836e-5361c5d265dd" containerName="registry-server" Oct 01 12:24:23 crc kubenswrapper[4669]: I1001 12:24:23.923372 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:23 crc kubenswrapper[4669]: I1001 12:24:23.932298 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vw6dh"] Oct 01 12:24:24 crc kubenswrapper[4669]: I1001 12:24:24.006376 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghpbl\" (UniqueName: \"kubernetes.io/projected/0aad4b2f-6730-4080-a09b-a01cefecf32b-kube-api-access-ghpbl\") pod \"redhat-marketplace-vw6dh\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:24 crc kubenswrapper[4669]: I1001 12:24:24.006542 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-catalog-content\") pod \"redhat-marketplace-vw6dh\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:24 crc kubenswrapper[4669]: I1001 12:24:24.006602 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-utilities\") pod \"redhat-marketplace-vw6dh\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:24 crc kubenswrapper[4669]: I1001 12:24:24.109389 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-catalog-content\") pod \"redhat-marketplace-vw6dh\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:24 crc kubenswrapper[4669]: I1001 12:24:24.109518 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-utilities\") pod \"redhat-marketplace-vw6dh\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:24 crc kubenswrapper[4669]: I1001 12:24:24.109708 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghpbl\" (UniqueName: \"kubernetes.io/projected/0aad4b2f-6730-4080-a09b-a01cefecf32b-kube-api-access-ghpbl\") pod \"redhat-marketplace-vw6dh\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:24 crc kubenswrapper[4669]: I1001 12:24:24.110050 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-utilities\") pod \"redhat-marketplace-vw6dh\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:24 crc kubenswrapper[4669]: I1001 12:24:24.110050 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-catalog-content\") pod \"redhat-marketplace-vw6dh\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:24 crc kubenswrapper[4669]: I1001 12:24:24.137443 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghpbl\" (UniqueName: \"kubernetes.io/projected/0aad4b2f-6730-4080-a09b-a01cefecf32b-kube-api-access-ghpbl\") pod \"redhat-marketplace-vw6dh\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:24 crc kubenswrapper[4669]: I1001 12:24:24.249932 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:24 crc kubenswrapper[4669]: I1001 12:24:24.726259 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vw6dh"] Oct 01 12:24:25 crc kubenswrapper[4669]: I1001 12:24:25.260185 4669 generic.go:334] "Generic (PLEG): container finished" podID="0aad4b2f-6730-4080-a09b-a01cefecf32b" containerID="0f1e8c2e56121624cd4468cf2f9442120c7867c91ed51d6b8bcb41edd8d8a4db" exitCode=0 Oct 01 12:24:25 crc kubenswrapper[4669]: I1001 12:24:25.260281 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vw6dh" event={"ID":"0aad4b2f-6730-4080-a09b-a01cefecf32b","Type":"ContainerDied","Data":"0f1e8c2e56121624cd4468cf2f9442120c7867c91ed51d6b8bcb41edd8d8a4db"} Oct 01 12:24:25 crc kubenswrapper[4669]: I1001 12:24:25.260643 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vw6dh" event={"ID":"0aad4b2f-6730-4080-a09b-a01cefecf32b","Type":"ContainerStarted","Data":"1bc8761bf223ceaf6adbc24afe806a09fa22397bf8b0f7ca52a0270152c1791a"} Oct 01 12:24:25 crc kubenswrapper[4669]: I1001 12:24:25.264702 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:24:27 crc kubenswrapper[4669]: I1001 12:24:27.299138 4669 generic.go:334] "Generic (PLEG): container finished" podID="0aad4b2f-6730-4080-a09b-a01cefecf32b" containerID="30b589b9e1bac0d8b98f9912499c57bcffa7e25ea64be5974745c18bf5e6af76" exitCode=0 Oct 01 12:24:27 crc kubenswrapper[4669]: I1001 12:24:27.300226 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vw6dh" event={"ID":"0aad4b2f-6730-4080-a09b-a01cefecf32b","Type":"ContainerDied","Data":"30b589b9e1bac0d8b98f9912499c57bcffa7e25ea64be5974745c18bf5e6af76"} Oct 01 12:24:28 crc kubenswrapper[4669]: I1001 12:24:28.314462 4669 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-vw6dh" event={"ID":"0aad4b2f-6730-4080-a09b-a01cefecf32b","Type":"ContainerStarted","Data":"950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae"} Oct 01 12:24:28 crc kubenswrapper[4669]: I1001 12:24:28.336899 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vw6dh" podStartSLOduration=2.89304775 podStartE2EDuration="5.336878237s" podCreationTimestamp="2025-10-01 12:24:23 +0000 UTC" firstStartedPulling="2025-10-01 12:24:25.262988545 +0000 UTC m=+3356.362553562" lastFinishedPulling="2025-10-01 12:24:27.706819072 +0000 UTC m=+3358.806384049" observedRunningTime="2025-10-01 12:24:28.335483023 +0000 UTC m=+3359.435048040" watchObservedRunningTime="2025-10-01 12:24:28.336878237 +0000 UTC m=+3359.436443224" Oct 01 12:24:28 crc kubenswrapper[4669]: I1001 12:24:28.643782 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:24:28 crc kubenswrapper[4669]: E1001 12:24:28.644145 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:24:34 crc kubenswrapper[4669]: I1001 12:24:34.250633 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:34 crc kubenswrapper[4669]: I1001 12:24:34.251496 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:34 crc kubenswrapper[4669]: I1001 12:24:34.317382 4669 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:34 crc kubenswrapper[4669]: I1001 12:24:34.445539 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:34 crc kubenswrapper[4669]: I1001 12:24:34.565126 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vw6dh"] Oct 01 12:24:36 crc kubenswrapper[4669]: I1001 12:24:36.410352 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vw6dh" podUID="0aad4b2f-6730-4080-a09b-a01cefecf32b" containerName="registry-server" containerID="cri-o://950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae" gracePeriod=2 Oct 01 12:24:36 crc kubenswrapper[4669]: I1001 12:24:36.985830 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.024448 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-utilities\") pod \"0aad4b2f-6730-4080-a09b-a01cefecf32b\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.024672 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-catalog-content\") pod \"0aad4b2f-6730-4080-a09b-a01cefecf32b\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.024757 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghpbl\" (UniqueName: \"kubernetes.io/projected/0aad4b2f-6730-4080-a09b-a01cefecf32b-kube-api-access-ghpbl\") pod 
\"0aad4b2f-6730-4080-a09b-a01cefecf32b\" (UID: \"0aad4b2f-6730-4080-a09b-a01cefecf32b\") " Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.026312 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-utilities" (OuterVolumeSpecName: "utilities") pod "0aad4b2f-6730-4080-a09b-a01cefecf32b" (UID: "0aad4b2f-6730-4080-a09b-a01cefecf32b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.042214 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aad4b2f-6730-4080-a09b-a01cefecf32b-kube-api-access-ghpbl" (OuterVolumeSpecName: "kube-api-access-ghpbl") pod "0aad4b2f-6730-4080-a09b-a01cefecf32b" (UID: "0aad4b2f-6730-4080-a09b-a01cefecf32b"). InnerVolumeSpecName "kube-api-access-ghpbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.052067 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0aad4b2f-6730-4080-a09b-a01cefecf32b" (UID: "0aad4b2f-6730-4080-a09b-a01cefecf32b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.128377 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.128440 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aad4b2f-6730-4080-a09b-a01cefecf32b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.128463 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghpbl\" (UniqueName: \"kubernetes.io/projected/0aad4b2f-6730-4080-a09b-a01cefecf32b-kube-api-access-ghpbl\") on node \"crc\" DevicePath \"\"" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.429663 4669 generic.go:334] "Generic (PLEG): container finished" podID="0aad4b2f-6730-4080-a09b-a01cefecf32b" containerID="950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae" exitCode=0 Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.429749 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vw6dh" event={"ID":"0aad4b2f-6730-4080-a09b-a01cefecf32b","Type":"ContainerDied","Data":"950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae"} Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.429803 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vw6dh" event={"ID":"0aad4b2f-6730-4080-a09b-a01cefecf32b","Type":"ContainerDied","Data":"1bc8761bf223ceaf6adbc24afe806a09fa22397bf8b0f7ca52a0270152c1791a"} Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.429799 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vw6dh" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.429875 4669 scope.go:117] "RemoveContainer" containerID="950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.468342 4669 scope.go:117] "RemoveContainer" containerID="30b589b9e1bac0d8b98f9912499c57bcffa7e25ea64be5974745c18bf5e6af76" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.501476 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vw6dh"] Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.511871 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vw6dh"] Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.515916 4669 scope.go:117] "RemoveContainer" containerID="0f1e8c2e56121624cd4468cf2f9442120c7867c91ed51d6b8bcb41edd8d8a4db" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.564738 4669 scope.go:117] "RemoveContainer" containerID="950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae" Oct 01 12:24:37 crc kubenswrapper[4669]: E1001 12:24:37.565604 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae\": container with ID starting with 950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae not found: ID does not exist" containerID="950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.565680 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae"} err="failed to get container status \"950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae\": rpc error: code = NotFound desc = could not find container 
\"950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae\": container with ID starting with 950471aa80847fcc416f8f5611d94381cbb5e6eca2123c5d6040bbda422c4cae not found: ID does not exist" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.565730 4669 scope.go:117] "RemoveContainer" containerID="30b589b9e1bac0d8b98f9912499c57bcffa7e25ea64be5974745c18bf5e6af76" Oct 01 12:24:37 crc kubenswrapper[4669]: E1001 12:24:37.566392 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30b589b9e1bac0d8b98f9912499c57bcffa7e25ea64be5974745c18bf5e6af76\": container with ID starting with 30b589b9e1bac0d8b98f9912499c57bcffa7e25ea64be5974745c18bf5e6af76 not found: ID does not exist" containerID="30b589b9e1bac0d8b98f9912499c57bcffa7e25ea64be5974745c18bf5e6af76" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.566449 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b589b9e1bac0d8b98f9912499c57bcffa7e25ea64be5974745c18bf5e6af76"} err="failed to get container status \"30b589b9e1bac0d8b98f9912499c57bcffa7e25ea64be5974745c18bf5e6af76\": rpc error: code = NotFound desc = could not find container \"30b589b9e1bac0d8b98f9912499c57bcffa7e25ea64be5974745c18bf5e6af76\": container with ID starting with 30b589b9e1bac0d8b98f9912499c57bcffa7e25ea64be5974745c18bf5e6af76 not found: ID does not exist" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.566488 4669 scope.go:117] "RemoveContainer" containerID="0f1e8c2e56121624cd4468cf2f9442120c7867c91ed51d6b8bcb41edd8d8a4db" Oct 01 12:24:37 crc kubenswrapper[4669]: E1001 12:24:37.567219 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f1e8c2e56121624cd4468cf2f9442120c7867c91ed51d6b8bcb41edd8d8a4db\": container with ID starting with 0f1e8c2e56121624cd4468cf2f9442120c7867c91ed51d6b8bcb41edd8d8a4db not found: ID does not exist" 
containerID="0f1e8c2e56121624cd4468cf2f9442120c7867c91ed51d6b8bcb41edd8d8a4db" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.567250 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f1e8c2e56121624cd4468cf2f9442120c7867c91ed51d6b8bcb41edd8d8a4db"} err="failed to get container status \"0f1e8c2e56121624cd4468cf2f9442120c7867c91ed51d6b8bcb41edd8d8a4db\": rpc error: code = NotFound desc = could not find container \"0f1e8c2e56121624cd4468cf2f9442120c7867c91ed51d6b8bcb41edd8d8a4db\": container with ID starting with 0f1e8c2e56121624cd4468cf2f9442120c7867c91ed51d6b8bcb41edd8d8a4db not found: ID does not exist" Oct 01 12:24:37 crc kubenswrapper[4669]: I1001 12:24:37.661887 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aad4b2f-6730-4080-a09b-a01cefecf32b" path="/var/lib/kubelet/pods/0aad4b2f-6730-4080-a09b-a01cefecf32b/volumes" Oct 01 12:24:43 crc kubenswrapper[4669]: I1001 12:24:43.644337 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:24:43 crc kubenswrapper[4669]: E1001 12:24:43.645537 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:24:57 crc kubenswrapper[4669]: I1001 12:24:57.646511 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:24:57 crc kubenswrapper[4669]: E1001 12:24:57.647874 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:25:08 crc kubenswrapper[4669]: I1001 12:25:08.644972 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:25:08 crc kubenswrapper[4669]: E1001 12:25:08.646255 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.394100 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tj9cg"] Oct 01 12:25:13 crc kubenswrapper[4669]: E1001 12:25:13.395092 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aad4b2f-6730-4080-a09b-a01cefecf32b" containerName="registry-server" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.395103 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aad4b2f-6730-4080-a09b-a01cefecf32b" containerName="registry-server" Oct 01 12:25:13 crc kubenswrapper[4669]: E1001 12:25:13.395130 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aad4b2f-6730-4080-a09b-a01cefecf32b" containerName="extract-content" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.395136 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aad4b2f-6730-4080-a09b-a01cefecf32b" containerName="extract-content" Oct 01 12:25:13 crc kubenswrapper[4669]: E1001 12:25:13.395154 4669 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0aad4b2f-6730-4080-a09b-a01cefecf32b" containerName="extract-utilities" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.395161 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aad4b2f-6730-4080-a09b-a01cefecf32b" containerName="extract-utilities" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.395331 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aad4b2f-6730-4080-a09b-a01cefecf32b" containerName="registry-server" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.396661 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.428637 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tj9cg"] Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.439964 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-utilities\") pod \"certified-operators-tj9cg\" (UID: \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.440340 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-catalog-content\") pod \"certified-operators-tj9cg\" (UID: \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.440790 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k5xh\" (UniqueName: \"kubernetes.io/projected/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-kube-api-access-9k5xh\") pod 
\"certified-operators-tj9cg\" (UID: \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.543643 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-utilities\") pod \"certified-operators-tj9cg\" (UID: \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.543885 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-catalog-content\") pod \"certified-operators-tj9cg\" (UID: \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.543967 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k5xh\" (UniqueName: \"kubernetes.io/projected/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-kube-api-access-9k5xh\") pod \"certified-operators-tj9cg\" (UID: \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.544187 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-utilities\") pod \"certified-operators-tj9cg\" (UID: \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.544456 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-catalog-content\") pod \"certified-operators-tj9cg\" (UID: 
\"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.572848 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k5xh\" (UniqueName: \"kubernetes.io/projected/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-kube-api-access-9k5xh\") pod \"certified-operators-tj9cg\" (UID: \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:13 crc kubenswrapper[4669]: I1001 12:25:13.735528 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:14 crc kubenswrapper[4669]: I1001 12:25:14.320623 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tj9cg"] Oct 01 12:25:14 crc kubenswrapper[4669]: W1001 12:25:14.331206 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda92224f7_f9af_46e9_a049_bf7db8c9a8e1.slice/crio-71bd9cc95e2b1b5610a679cf8efd45e08682e6aacd9632dd18be6d5618132b16 WatchSource:0}: Error finding container 71bd9cc95e2b1b5610a679cf8efd45e08682e6aacd9632dd18be6d5618132b16: Status 404 returned error can't find the container with id 71bd9cc95e2b1b5610a679cf8efd45e08682e6aacd9632dd18be6d5618132b16 Oct 01 12:25:14 crc kubenswrapper[4669]: I1001 12:25:14.912667 4669 generic.go:334] "Generic (PLEG): container finished" podID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" containerID="fc624a458b263f11af7941d1be43b042aa0a0464bc6fbab3b1d796d8204daad7" exitCode=0 Oct 01 12:25:14 crc kubenswrapper[4669]: I1001 12:25:14.912773 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj9cg" event={"ID":"a92224f7-f9af-46e9-a049-bf7db8c9a8e1","Type":"ContainerDied","Data":"fc624a458b263f11af7941d1be43b042aa0a0464bc6fbab3b1d796d8204daad7"} Oct 01 12:25:14 
crc kubenswrapper[4669]: I1001 12:25:14.913066 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj9cg" event={"ID":"a92224f7-f9af-46e9-a049-bf7db8c9a8e1","Type":"ContainerStarted","Data":"71bd9cc95e2b1b5610a679cf8efd45e08682e6aacd9632dd18be6d5618132b16"} Oct 01 12:25:16 crc kubenswrapper[4669]: I1001 12:25:16.938051 4669 generic.go:334] "Generic (PLEG): container finished" podID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" containerID="0c74343bc87bd730549d4420ba7e09ab41cfbdd8a964fc828fd1dbca143f2892" exitCode=0 Oct 01 12:25:16 crc kubenswrapper[4669]: I1001 12:25:16.938130 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj9cg" event={"ID":"a92224f7-f9af-46e9-a049-bf7db8c9a8e1","Type":"ContainerDied","Data":"0c74343bc87bd730549d4420ba7e09ab41cfbdd8a964fc828fd1dbca143f2892"} Oct 01 12:25:18 crc kubenswrapper[4669]: I1001 12:25:18.977685 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj9cg" event={"ID":"a92224f7-f9af-46e9-a049-bf7db8c9a8e1","Type":"ContainerStarted","Data":"cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f"} Oct 01 12:25:19 crc kubenswrapper[4669]: I1001 12:25:19.012395 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tj9cg" podStartSLOduration=2.964744447 podStartE2EDuration="6.012356289s" podCreationTimestamp="2025-10-01 12:25:13 +0000 UTC" firstStartedPulling="2025-10-01 12:25:14.916621202 +0000 UTC m=+3406.016186199" lastFinishedPulling="2025-10-01 12:25:17.964233064 +0000 UTC m=+3409.063798041" observedRunningTime="2025-10-01 12:25:18.996195171 +0000 UTC m=+3410.095760148" watchObservedRunningTime="2025-10-01 12:25:19.012356289 +0000 UTC m=+3410.111921276" Oct 01 12:25:21 crc kubenswrapper[4669]: I1001 12:25:21.644306 4669 scope.go:117] "RemoveContainer" 
containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:25:21 crc kubenswrapper[4669]: E1001 12:25:21.644947 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:25:23 crc kubenswrapper[4669]: I1001 12:25:23.736165 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:23 crc kubenswrapper[4669]: I1001 12:25:23.736789 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:23 crc kubenswrapper[4669]: I1001 12:25:23.812921 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:24 crc kubenswrapper[4669]: I1001 12:25:24.092803 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:24 crc kubenswrapper[4669]: I1001 12:25:24.163414 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tj9cg"] Oct 01 12:25:26 crc kubenswrapper[4669]: I1001 12:25:26.053367 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tj9cg" podUID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" containerName="registry-server" containerID="cri-o://cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f" gracePeriod=2 Oct 01 12:25:26 crc kubenswrapper[4669]: I1001 12:25:26.677514 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:26 crc kubenswrapper[4669]: I1001 12:25:26.773250 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-utilities\") pod \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\" (UID: \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " Oct 01 12:25:26 crc kubenswrapper[4669]: I1001 12:25:26.773352 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k5xh\" (UniqueName: \"kubernetes.io/projected/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-kube-api-access-9k5xh\") pod \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\" (UID: \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " Oct 01 12:25:26 crc kubenswrapper[4669]: I1001 12:25:26.773508 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-catalog-content\") pod \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\" (UID: \"a92224f7-f9af-46e9-a049-bf7db8c9a8e1\") " Oct 01 12:25:26 crc kubenswrapper[4669]: I1001 12:25:26.773833 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-utilities" (OuterVolumeSpecName: "utilities") pod "a92224f7-f9af-46e9-a049-bf7db8c9a8e1" (UID: "a92224f7-f9af-46e9-a049-bf7db8c9a8e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:25:26 crc kubenswrapper[4669]: I1001 12:25:26.774404 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:25:26 crc kubenswrapper[4669]: I1001 12:25:26.785892 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-kube-api-access-9k5xh" (OuterVolumeSpecName: "kube-api-access-9k5xh") pod "a92224f7-f9af-46e9-a049-bf7db8c9a8e1" (UID: "a92224f7-f9af-46e9-a049-bf7db8c9a8e1"). InnerVolumeSpecName "kube-api-access-9k5xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:25:26 crc kubenswrapper[4669]: I1001 12:25:26.824561 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a92224f7-f9af-46e9-a049-bf7db8c9a8e1" (UID: "a92224f7-f9af-46e9-a049-bf7db8c9a8e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:25:26 crc kubenswrapper[4669]: I1001 12:25:26.875899 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k5xh\" (UniqueName: \"kubernetes.io/projected/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-kube-api-access-9k5xh\") on node \"crc\" DevicePath \"\"" Oct 01 12:25:26 crc kubenswrapper[4669]: I1001 12:25:26.875938 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92224f7-f9af-46e9-a049-bf7db8c9a8e1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.068299 4669 generic.go:334] "Generic (PLEG): container finished" podID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" containerID="cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f" exitCode=0 Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.068379 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tj9cg" Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.068385 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj9cg" event={"ID":"a92224f7-f9af-46e9-a049-bf7db8c9a8e1","Type":"ContainerDied","Data":"cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f"} Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.068546 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj9cg" event={"ID":"a92224f7-f9af-46e9-a049-bf7db8c9a8e1","Type":"ContainerDied","Data":"71bd9cc95e2b1b5610a679cf8efd45e08682e6aacd9632dd18be6d5618132b16"} Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.068591 4669 scope.go:117] "RemoveContainer" containerID="cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f" Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.114265 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-tj9cg"] Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.123514 4669 scope.go:117] "RemoveContainer" containerID="0c74343bc87bd730549d4420ba7e09ab41cfbdd8a964fc828fd1dbca143f2892" Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.124601 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tj9cg"] Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.150217 4669 scope.go:117] "RemoveContainer" containerID="fc624a458b263f11af7941d1be43b042aa0a0464bc6fbab3b1d796d8204daad7" Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.200896 4669 scope.go:117] "RemoveContainer" containerID="cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f" Oct 01 12:25:27 crc kubenswrapper[4669]: E1001 12:25:27.201481 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f\": container with ID starting with cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f not found: ID does not exist" containerID="cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f" Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.201618 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f"} err="failed to get container status \"cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f\": rpc error: code = NotFound desc = could not find container \"cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f\": container with ID starting with cb25abc4fd1755eff8479d45137df1afb3be4beac9e0d41e55adcf98142f6d4f not found: ID does not exist" Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.201727 4669 scope.go:117] "RemoveContainer" 
containerID="0c74343bc87bd730549d4420ba7e09ab41cfbdd8a964fc828fd1dbca143f2892" Oct 01 12:25:27 crc kubenswrapper[4669]: E1001 12:25:27.202228 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c74343bc87bd730549d4420ba7e09ab41cfbdd8a964fc828fd1dbca143f2892\": container with ID starting with 0c74343bc87bd730549d4420ba7e09ab41cfbdd8a964fc828fd1dbca143f2892 not found: ID does not exist" containerID="0c74343bc87bd730549d4420ba7e09ab41cfbdd8a964fc828fd1dbca143f2892" Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.202259 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c74343bc87bd730549d4420ba7e09ab41cfbdd8a964fc828fd1dbca143f2892"} err="failed to get container status \"0c74343bc87bd730549d4420ba7e09ab41cfbdd8a964fc828fd1dbca143f2892\": rpc error: code = NotFound desc = could not find container \"0c74343bc87bd730549d4420ba7e09ab41cfbdd8a964fc828fd1dbca143f2892\": container with ID starting with 0c74343bc87bd730549d4420ba7e09ab41cfbdd8a964fc828fd1dbca143f2892 not found: ID does not exist" Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.202278 4669 scope.go:117] "RemoveContainer" containerID="fc624a458b263f11af7941d1be43b042aa0a0464bc6fbab3b1d796d8204daad7" Oct 01 12:25:27 crc kubenswrapper[4669]: E1001 12:25:27.202693 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc624a458b263f11af7941d1be43b042aa0a0464bc6fbab3b1d796d8204daad7\": container with ID starting with fc624a458b263f11af7941d1be43b042aa0a0464bc6fbab3b1d796d8204daad7 not found: ID does not exist" containerID="fc624a458b263f11af7941d1be43b042aa0a0464bc6fbab3b1d796d8204daad7" Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.202803 4669 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fc624a458b263f11af7941d1be43b042aa0a0464bc6fbab3b1d796d8204daad7"} err="failed to get container status \"fc624a458b263f11af7941d1be43b042aa0a0464bc6fbab3b1d796d8204daad7\": rpc error: code = NotFound desc = could not find container \"fc624a458b263f11af7941d1be43b042aa0a0464bc6fbab3b1d796d8204daad7\": container with ID starting with fc624a458b263f11af7941d1be43b042aa0a0464bc6fbab3b1d796d8204daad7 not found: ID does not exist" Oct 01 12:25:27 crc kubenswrapper[4669]: I1001 12:25:27.664008 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" path="/var/lib/kubelet/pods/a92224f7-f9af-46e9-a049-bf7db8c9a8e1/volumes" Oct 01 12:25:34 crc kubenswrapper[4669]: I1001 12:25:34.645484 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:25:35 crc kubenswrapper[4669]: I1001 12:25:35.151281 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"784c7f7fc27aa6a93a1fa55ebe85565db9e1e1b5c58371a518406bc62cba9814"} Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.354190 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h9hpb"] Oct 01 12:27:17 crc kubenswrapper[4669]: E1001 12:27:17.355537 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" containerName="registry-server" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.355561 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" containerName="registry-server" Oct 01 12:27:17 crc kubenswrapper[4669]: E1001 12:27:17.355587 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" 
containerName="extract-utilities" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.355598 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" containerName="extract-utilities" Oct 01 12:27:17 crc kubenswrapper[4669]: E1001 12:27:17.355631 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" containerName="extract-content" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.355644 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" containerName="extract-content" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.355955 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="a92224f7-f9af-46e9-a049-bf7db8c9a8e1" containerName="registry-server" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.357900 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.378150 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h9hpb"] Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.522831 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-utilities\") pod \"community-operators-h9hpb\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.523953 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjktl\" (UniqueName: \"kubernetes.io/projected/3a4abb5e-f277-4e3f-816d-b67cc195bc67-kube-api-access-hjktl\") pod \"community-operators-h9hpb\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " 
pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.524025 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-catalog-content\") pod \"community-operators-h9hpb\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.626537 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-utilities\") pod \"community-operators-h9hpb\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.627104 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-utilities\") pod \"community-operators-h9hpb\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.627286 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjktl\" (UniqueName: \"kubernetes.io/projected/3a4abb5e-f277-4e3f-816d-b67cc195bc67-kube-api-access-hjktl\") pod \"community-operators-h9hpb\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.627366 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-catalog-content\") pod \"community-operators-h9hpb\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " 
pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.627690 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-catalog-content\") pod \"community-operators-h9hpb\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.658336 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjktl\" (UniqueName: \"kubernetes.io/projected/3a4abb5e-f277-4e3f-816d-b67cc195bc67-kube-api-access-hjktl\") pod \"community-operators-h9hpb\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:17 crc kubenswrapper[4669]: I1001 12:27:17.681707 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:18 crc kubenswrapper[4669]: I1001 12:27:18.252993 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h9hpb"] Oct 01 12:27:18 crc kubenswrapper[4669]: I1001 12:27:18.434282 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9hpb" event={"ID":"3a4abb5e-f277-4e3f-816d-b67cc195bc67","Type":"ContainerStarted","Data":"885bae74a5af3af37099a0a719f2462dce79bce9e49cb48183d122a587b5be45"} Oct 01 12:27:18 crc kubenswrapper[4669]: E1001 12:27:18.774320 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a4abb5e_f277_4e3f_816d_b67cc195bc67.slice/crio-conmon-59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5.scope\": RecentStats: unable to find data in memory cache]" Oct 01 12:27:19 crc kubenswrapper[4669]: 
I1001 12:27:19.448472 4669 generic.go:334] "Generic (PLEG): container finished" podID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" containerID="59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5" exitCode=0 Oct 01 12:27:19 crc kubenswrapper[4669]: I1001 12:27:19.448953 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9hpb" event={"ID":"3a4abb5e-f277-4e3f-816d-b67cc195bc67","Type":"ContainerDied","Data":"59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5"} Oct 01 12:27:21 crc kubenswrapper[4669]: I1001 12:27:21.473112 4669 generic.go:334] "Generic (PLEG): container finished" podID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" containerID="a91e2aa021110f208fc74468614079d3ff5e31dad390557dd4d839158c7f5e1b" exitCode=0 Oct 01 12:27:21 crc kubenswrapper[4669]: I1001 12:27:21.473820 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9hpb" event={"ID":"3a4abb5e-f277-4e3f-816d-b67cc195bc67","Type":"ContainerDied","Data":"a91e2aa021110f208fc74468614079d3ff5e31dad390557dd4d839158c7f5e1b"} Oct 01 12:27:22 crc kubenswrapper[4669]: I1001 12:27:22.494878 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9hpb" event={"ID":"3a4abb5e-f277-4e3f-816d-b67cc195bc67","Type":"ContainerStarted","Data":"e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2"} Oct 01 12:27:22 crc kubenswrapper[4669]: I1001 12:27:22.521498 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h9hpb" podStartSLOduration=2.906704424 podStartE2EDuration="5.521467897s" podCreationTimestamp="2025-10-01 12:27:17 +0000 UTC" firstStartedPulling="2025-10-01 12:27:19.453407991 +0000 UTC m=+3530.552973008" lastFinishedPulling="2025-10-01 12:27:22.068171514 +0000 UTC m=+3533.167736481" observedRunningTime="2025-10-01 12:27:22.515997782 +0000 UTC m=+3533.615562759" 
watchObservedRunningTime="2025-10-01 12:27:22.521467897 +0000 UTC m=+3533.621032884" Oct 01 12:27:27 crc kubenswrapper[4669]: I1001 12:27:27.682264 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:27 crc kubenswrapper[4669]: I1001 12:27:27.684465 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:27 crc kubenswrapper[4669]: I1001 12:27:27.743307 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:28 crc kubenswrapper[4669]: I1001 12:27:28.632468 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:28 crc kubenswrapper[4669]: I1001 12:27:28.698072 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h9hpb"] Oct 01 12:27:30 crc kubenswrapper[4669]: I1001 12:27:30.585265 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h9hpb" podUID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" containerName="registry-server" containerID="cri-o://e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2" gracePeriod=2 Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.185301 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.374732 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjktl\" (UniqueName: \"kubernetes.io/projected/3a4abb5e-f277-4e3f-816d-b67cc195bc67-kube-api-access-hjktl\") pod \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.374963 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-utilities\") pod \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.375157 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-catalog-content\") pod \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\" (UID: \"3a4abb5e-f277-4e3f-816d-b67cc195bc67\") " Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.375810 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-utilities" (OuterVolumeSpecName: "utilities") pod "3a4abb5e-f277-4e3f-816d-b67cc195bc67" (UID: "3a4abb5e-f277-4e3f-816d-b67cc195bc67"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.375962 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.383387 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a4abb5e-f277-4e3f-816d-b67cc195bc67-kube-api-access-hjktl" (OuterVolumeSpecName: "kube-api-access-hjktl") pod "3a4abb5e-f277-4e3f-816d-b67cc195bc67" (UID: "3a4abb5e-f277-4e3f-816d-b67cc195bc67"). InnerVolumeSpecName "kube-api-access-hjktl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.451783 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a4abb5e-f277-4e3f-816d-b67cc195bc67" (UID: "3a4abb5e-f277-4e3f-816d-b67cc195bc67"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.478194 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a4abb5e-f277-4e3f-816d-b67cc195bc67-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.478651 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjktl\" (UniqueName: \"kubernetes.io/projected/3a4abb5e-f277-4e3f-816d-b67cc195bc67-kube-api-access-hjktl\") on node \"crc\" DevicePath \"\"" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.601986 4669 generic.go:334] "Generic (PLEG): container finished" podID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" containerID="e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2" exitCode=0 Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.602062 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9hpb" event={"ID":"3a4abb5e-f277-4e3f-816d-b67cc195bc67","Type":"ContainerDied","Data":"e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2"} Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.602149 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9hpb" event={"ID":"3a4abb5e-f277-4e3f-816d-b67cc195bc67","Type":"ContainerDied","Data":"885bae74a5af3af37099a0a719f2462dce79bce9e49cb48183d122a587b5be45"} Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.602073 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h9hpb" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.602170 4669 scope.go:117] "RemoveContainer" containerID="e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.626690 4669 scope.go:117] "RemoveContainer" containerID="a91e2aa021110f208fc74468614079d3ff5e31dad390557dd4d839158c7f5e1b" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.661904 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h9hpb"] Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.669690 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h9hpb"] Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.676008 4669 scope.go:117] "RemoveContainer" containerID="59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.736877 4669 scope.go:117] "RemoveContainer" containerID="e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2" Oct 01 12:27:31 crc kubenswrapper[4669]: E1001 12:27:31.737675 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2\": container with ID starting with e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2 not found: ID does not exist" containerID="e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.737773 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2"} err="failed to get container status \"e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2\": rpc error: code = NotFound desc = could not find 
container \"e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2\": container with ID starting with e3aa789b2a9cb81e33e04d31704a1f077a4e3413bc7e0e4b4fc8498070e5dae2 not found: ID does not exist" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.737840 4669 scope.go:117] "RemoveContainer" containerID="a91e2aa021110f208fc74468614079d3ff5e31dad390557dd4d839158c7f5e1b" Oct 01 12:27:31 crc kubenswrapper[4669]: E1001 12:27:31.738335 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a91e2aa021110f208fc74468614079d3ff5e31dad390557dd4d839158c7f5e1b\": container with ID starting with a91e2aa021110f208fc74468614079d3ff5e31dad390557dd4d839158c7f5e1b not found: ID does not exist" containerID="a91e2aa021110f208fc74468614079d3ff5e31dad390557dd4d839158c7f5e1b" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.738377 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a91e2aa021110f208fc74468614079d3ff5e31dad390557dd4d839158c7f5e1b"} err="failed to get container status \"a91e2aa021110f208fc74468614079d3ff5e31dad390557dd4d839158c7f5e1b\": rpc error: code = NotFound desc = could not find container \"a91e2aa021110f208fc74468614079d3ff5e31dad390557dd4d839158c7f5e1b\": container with ID starting with a91e2aa021110f208fc74468614079d3ff5e31dad390557dd4d839158c7f5e1b not found: ID does not exist" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.738407 4669 scope.go:117] "RemoveContainer" containerID="59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5" Oct 01 12:27:31 crc kubenswrapper[4669]: E1001 12:27:31.739832 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5\": container with ID starting with 59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5 not found: ID does 
not exist" containerID="59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5" Oct 01 12:27:31 crc kubenswrapper[4669]: I1001 12:27:31.739920 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5"} err="failed to get container status \"59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5\": rpc error: code = NotFound desc = could not find container \"59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5\": container with ID starting with 59dca1e0750722425733bccdcd184ddb7edc9eaf3d4af5a097197f898ff139c5 not found: ID does not exist" Oct 01 12:27:33 crc kubenswrapper[4669]: I1001 12:27:33.670386 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" path="/var/lib/kubelet/pods/3a4abb5e-f277-4e3f-816d-b67cc195bc67/volumes" Oct 01 12:28:01 crc kubenswrapper[4669]: I1001 12:28:01.863799 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:28:01 crc kubenswrapper[4669]: I1001 12:28:01.864719 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:28:18 crc kubenswrapper[4669]: I1001 12:28:18.985331 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wk9t5"] Oct 01 12:28:18 crc kubenswrapper[4669]: E1001 12:28:18.986624 4669 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" containerName="extract-utilities" Oct 01 12:28:18 crc kubenswrapper[4669]: I1001 12:28:18.986642 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" containerName="extract-utilities" Oct 01 12:28:18 crc kubenswrapper[4669]: E1001 12:28:18.986657 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" containerName="registry-server" Oct 01 12:28:18 crc kubenswrapper[4669]: I1001 12:28:18.986664 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" containerName="registry-server" Oct 01 12:28:18 crc kubenswrapper[4669]: E1001 12:28:18.986716 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" containerName="extract-content" Oct 01 12:28:18 crc kubenswrapper[4669]: I1001 12:28:18.986724 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" containerName="extract-content" Oct 01 12:28:18 crc kubenswrapper[4669]: I1001 12:28:18.986948 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a4abb5e-f277-4e3f-816d-b67cc195bc67" containerName="registry-server" Oct 01 12:28:18 crc kubenswrapper[4669]: I1001 12:28:18.988500 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.007257 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wk9t5"] Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.070715 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-utilities\") pod \"redhat-operators-wk9t5\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.070818 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-catalog-content\") pod \"redhat-operators-wk9t5\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.070853 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rzp9\" (UniqueName: \"kubernetes.io/projected/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-kube-api-access-7rzp9\") pod \"redhat-operators-wk9t5\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.172345 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-utilities\") pod \"redhat-operators-wk9t5\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.172779 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-catalog-content\") pod \"redhat-operators-wk9t5\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.172911 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rzp9\" (UniqueName: \"kubernetes.io/projected/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-kube-api-access-7rzp9\") pod \"redhat-operators-wk9t5\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.172826 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-utilities\") pod \"redhat-operators-wk9t5\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.173372 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-catalog-content\") pod \"redhat-operators-wk9t5\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.204494 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rzp9\" (UniqueName: \"kubernetes.io/projected/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-kube-api-access-7rzp9\") pod \"redhat-operators-wk9t5\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.329905 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:19 crc kubenswrapper[4669]: I1001 12:28:19.815021 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wk9t5"] Oct 01 12:28:20 crc kubenswrapper[4669]: I1001 12:28:20.181658 4669 generic.go:334] "Generic (PLEG): container finished" podID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerID="70148fc3e9ea5326ff49f41bd309e465711e486f423d608296cca888e3b7ea7d" exitCode=0 Oct 01 12:28:20 crc kubenswrapper[4669]: I1001 12:28:20.181784 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wk9t5" event={"ID":"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d","Type":"ContainerDied","Data":"70148fc3e9ea5326ff49f41bd309e465711e486f423d608296cca888e3b7ea7d"} Oct 01 12:28:20 crc kubenswrapper[4669]: I1001 12:28:20.182286 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wk9t5" event={"ID":"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d","Type":"ContainerStarted","Data":"6850fc1351690bf3c9e4dfe9246b916c1a57f4073fb757b83b1ba785dcd58d1b"} Oct 01 12:28:22 crc kubenswrapper[4669]: I1001 12:28:22.206951 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wk9t5" event={"ID":"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d","Type":"ContainerStarted","Data":"8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf"} Oct 01 12:28:23 crc kubenswrapper[4669]: I1001 12:28:23.220642 4669 generic.go:334] "Generic (PLEG): container finished" podID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerID="8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf" exitCode=0 Oct 01 12:28:23 crc kubenswrapper[4669]: I1001 12:28:23.220735 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wk9t5" 
event={"ID":"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d","Type":"ContainerDied","Data":"8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf"} Oct 01 12:28:24 crc kubenswrapper[4669]: I1001 12:28:24.234615 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wk9t5" event={"ID":"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d","Type":"ContainerStarted","Data":"a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a"} Oct 01 12:28:24 crc kubenswrapper[4669]: I1001 12:28:24.259620 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wk9t5" podStartSLOduration=2.42201616 podStartE2EDuration="6.259594619s" podCreationTimestamp="2025-10-01 12:28:18 +0000 UTC" firstStartedPulling="2025-10-01 12:28:20.184377889 +0000 UTC m=+3591.283942866" lastFinishedPulling="2025-10-01 12:28:24.021956358 +0000 UTC m=+3595.121521325" observedRunningTime="2025-10-01 12:28:24.256809111 +0000 UTC m=+3595.356374088" watchObservedRunningTime="2025-10-01 12:28:24.259594619 +0000 UTC m=+3595.359159606" Oct 01 12:28:29 crc kubenswrapper[4669]: I1001 12:28:29.330706 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:29 crc kubenswrapper[4669]: I1001 12:28:29.332346 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:30 crc kubenswrapper[4669]: I1001 12:28:30.413221 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wk9t5" podUID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerName="registry-server" probeResult="failure" output=< Oct 01 12:28:30 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 01 12:28:30 crc kubenswrapper[4669]: > Oct 01 12:28:31 crc kubenswrapper[4669]: I1001 12:28:31.863217 4669 patch_prober.go:28] interesting 
pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:28:31 crc kubenswrapper[4669]: I1001 12:28:31.863298 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:28:40 crc kubenswrapper[4669]: I1001 12:28:40.389650 4669 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wk9t5" podUID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerName="registry-server" probeResult="failure" output=< Oct 01 12:28:40 crc kubenswrapper[4669]: timeout: failed to connect service ":50051" within 1s Oct 01 12:28:40 crc kubenswrapper[4669]: > Oct 01 12:28:49 crc kubenswrapper[4669]: I1001 12:28:49.414625 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:49 crc kubenswrapper[4669]: I1001 12:28:49.479970 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:50 crc kubenswrapper[4669]: I1001 12:28:50.182054 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wk9t5"] Oct 01 12:28:50 crc kubenswrapper[4669]: I1001 12:28:50.574815 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wk9t5" podUID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerName="registry-server" containerID="cri-o://a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a" gracePeriod=2 Oct 01 12:28:51 crc 
kubenswrapper[4669]: I1001 12:28:51.268860 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.382332 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-utilities\") pod \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.382458 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-catalog-content\") pod \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.382662 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rzp9\" (UniqueName: \"kubernetes.io/projected/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-kube-api-access-7rzp9\") pod \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\" (UID: \"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d\") " Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.383694 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-utilities" (OuterVolumeSpecName: "utilities") pod "621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" (UID: "621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.392161 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-kube-api-access-7rzp9" (OuterVolumeSpecName: "kube-api-access-7rzp9") pod "621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" (UID: "621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d"). InnerVolumeSpecName "kube-api-access-7rzp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.485364 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rzp9\" (UniqueName: \"kubernetes.io/projected/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-kube-api-access-7rzp9\") on node \"crc\" DevicePath \"\"" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.485388 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.488645 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" (UID: "621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.587876 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.589571 4669 generic.go:334] "Generic (PLEG): container finished" podID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerID="a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a" exitCode=0 Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.589643 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wk9t5" event={"ID":"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d","Type":"ContainerDied","Data":"a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a"} Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.589675 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wk9t5" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.589736 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wk9t5" event={"ID":"621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d","Type":"ContainerDied","Data":"6850fc1351690bf3c9e4dfe9246b916c1a57f4073fb757b83b1ba785dcd58d1b"} Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.589777 4669 scope.go:117] "RemoveContainer" containerID="a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.641054 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wk9t5"] Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.647382 4669 scope.go:117] "RemoveContainer" containerID="8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.663522 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wk9t5"] Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.677411 4669 scope.go:117] "RemoveContainer" containerID="70148fc3e9ea5326ff49f41bd309e465711e486f423d608296cca888e3b7ea7d" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.733195 4669 scope.go:117] "RemoveContainer" containerID="a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a" Oct 01 12:28:51 crc kubenswrapper[4669]: E1001 12:28:51.734142 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a\": container with ID starting with a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a not found: ID does not exist" containerID="a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.734232 4669 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a"} err="failed to get container status \"a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a\": rpc error: code = NotFound desc = could not find container \"a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a\": container with ID starting with a7b623e1d8bc1ea17f9417c5283ba7f3ff9f815d9fcccfa99c432df420e7aa2a not found: ID does not exist" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.734272 4669 scope.go:117] "RemoveContainer" containerID="8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf" Oct 01 12:28:51 crc kubenswrapper[4669]: E1001 12:28:51.734837 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf\": container with ID starting with 8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf not found: ID does not exist" containerID="8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.734899 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf"} err="failed to get container status \"8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf\": rpc error: code = NotFound desc = could not find container \"8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf\": container with ID starting with 8622eaa8bf3985a61bea7be4f2a7bf137baf19e21f440606bcb559e66559dcdf not found: ID does not exist" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.734946 4669 scope.go:117] "RemoveContainer" containerID="70148fc3e9ea5326ff49f41bd309e465711e486f423d608296cca888e3b7ea7d" Oct 01 12:28:51 crc kubenswrapper[4669]: E1001 
12:28:51.735440 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70148fc3e9ea5326ff49f41bd309e465711e486f423d608296cca888e3b7ea7d\": container with ID starting with 70148fc3e9ea5326ff49f41bd309e465711e486f423d608296cca888e3b7ea7d not found: ID does not exist" containerID="70148fc3e9ea5326ff49f41bd309e465711e486f423d608296cca888e3b7ea7d" Oct 01 12:28:51 crc kubenswrapper[4669]: I1001 12:28:51.735514 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70148fc3e9ea5326ff49f41bd309e465711e486f423d608296cca888e3b7ea7d"} err="failed to get container status \"70148fc3e9ea5326ff49f41bd309e465711e486f423d608296cca888e3b7ea7d\": rpc error: code = NotFound desc = could not find container \"70148fc3e9ea5326ff49f41bd309e465711e486f423d608296cca888e3b7ea7d\": container with ID starting with 70148fc3e9ea5326ff49f41bd309e465711e486f423d608296cca888e3b7ea7d not found: ID does not exist" Oct 01 12:28:53 crc kubenswrapper[4669]: I1001 12:28:53.668481 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" path="/var/lib/kubelet/pods/621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d/volumes" Oct 01 12:29:01 crc kubenswrapper[4669]: I1001 12:29:01.869284 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:29:01 crc kubenswrapper[4669]: I1001 12:29:01.870604 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 01 12:29:01 crc kubenswrapper[4669]: I1001 12:29:01.870721 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 12:29:01 crc kubenswrapper[4669]: I1001 12:29:01.872579 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"784c7f7fc27aa6a93a1fa55ebe85565db9e1e1b5c58371a518406bc62cba9814"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:29:01 crc kubenswrapper[4669]: I1001 12:29:01.880071 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://784c7f7fc27aa6a93a1fa55ebe85565db9e1e1b5c58371a518406bc62cba9814" gracePeriod=600 Oct 01 12:29:02 crc kubenswrapper[4669]: I1001 12:29:02.719568 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="784c7f7fc27aa6a93a1fa55ebe85565db9e1e1b5c58371a518406bc62cba9814" exitCode=0 Oct 01 12:29:02 crc kubenswrapper[4669]: I1001 12:29:02.719632 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"784c7f7fc27aa6a93a1fa55ebe85565db9e1e1b5c58371a518406bc62cba9814"} Oct 01 12:29:02 crc kubenswrapper[4669]: I1001 12:29:02.720155 4669 scope.go:117] "RemoveContainer" containerID="017ae12737b89ea06d31f9642ccdc81bfe2d55b3e38aa4593883cf050f1d469f" Oct 01 12:29:03 crc kubenswrapper[4669]: I1001 12:29:03.732852 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" 
event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c"} Oct 01 12:29:33 crc kubenswrapper[4669]: I1001 12:29:33.088402 4669 generic.go:334] "Generic (PLEG): container finished" podID="fce73f67-b429-4b4a-b873-a45f92d104c7" containerID="b1a063941d16fd79a319621e61e3e40f314c439f5644ea6bfbcb2339f332596f" exitCode=0 Oct 01 12:29:33 crc kubenswrapper[4669]: I1001 12:29:33.088483 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fce73f67-b429-4b4a-b873-a45f92d104c7","Type":"ContainerDied","Data":"b1a063941d16fd79a319621e61e3e40f314c439f5644ea6bfbcb2339f332596f"} Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.591453 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.765525 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"fce73f67-b429-4b4a-b873-a45f92d104c7\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.765670 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config-secret\") pod \"fce73f67-b429-4b4a-b873-a45f92d104c7\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.765722 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-config-data\") pod \"fce73f67-b429-4b4a-b873-a45f92d104c7\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 
12:29:34.765921 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config\") pod \"fce73f67-b429-4b4a-b873-a45f92d104c7\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.765980 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-temporary\") pod \"fce73f67-b429-4b4a-b873-a45f92d104c7\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.766021 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ca-certs\") pod \"fce73f67-b429-4b4a-b873-a45f92d104c7\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.766101 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ssh-key\") pod \"fce73f67-b429-4b4a-b873-a45f92d104c7\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.766205 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phzg9\" (UniqueName: \"kubernetes.io/projected/fce73f67-b429-4b4a-b873-a45f92d104c7-kube-api-access-phzg9\") pod \"fce73f67-b429-4b4a-b873-a45f92d104c7\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.766284 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-workdir\") pod \"fce73f67-b429-4b4a-b873-a45f92d104c7\" (UID: \"fce73f67-b429-4b4a-b873-a45f92d104c7\") " Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.766798 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "fce73f67-b429-4b4a-b873-a45f92d104c7" (UID: "fce73f67-b429-4b4a-b873-a45f92d104c7"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.767129 4669 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.767960 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-config-data" (OuterVolumeSpecName: "config-data") pod "fce73f67-b429-4b4a-b873-a45f92d104c7" (UID: "fce73f67-b429-4b4a-b873-a45f92d104c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.771861 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "fce73f67-b429-4b4a-b873-a45f92d104c7" (UID: "fce73f67-b429-4b4a-b873-a45f92d104c7"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.778497 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "fce73f67-b429-4b4a-b873-a45f92d104c7" (UID: "fce73f67-b429-4b4a-b873-a45f92d104c7"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.778567 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fce73f67-b429-4b4a-b873-a45f92d104c7-kube-api-access-phzg9" (OuterVolumeSpecName: "kube-api-access-phzg9") pod "fce73f67-b429-4b4a-b873-a45f92d104c7" (UID: "fce73f67-b429-4b4a-b873-a45f92d104c7"). InnerVolumeSpecName "kube-api-access-phzg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.824460 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "fce73f67-b429-4b4a-b873-a45f92d104c7" (UID: "fce73f67-b429-4b4a-b873-a45f92d104c7"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.831538 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fce73f67-b429-4b4a-b873-a45f92d104c7" (UID: "fce73f67-b429-4b4a-b873-a45f92d104c7"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.834717 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fce73f67-b429-4b4a-b873-a45f92d104c7" (UID: "fce73f67-b429-4b4a-b873-a45f92d104c7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.835187 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fce73f67-b429-4b4a-b873-a45f92d104c7" (UID: "fce73f67-b429-4b4a-b873-a45f92d104c7"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.870313 4669 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.870364 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.870377 4669 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.870387 4669 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fce73f67-b429-4b4a-b873-a45f92d104c7-openstack-config\") on node \"crc\" DevicePath 
\"\"" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.870398 4669 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.870409 4669 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce73f67-b429-4b4a-b873-a45f92d104c7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.870419 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phzg9\" (UniqueName: \"kubernetes.io/projected/fce73f67-b429-4b4a-b873-a45f92d104c7-kube-api-access-phzg9\") on node \"crc\" DevicePath \"\"" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.870436 4669 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fce73f67-b429-4b4a-b873-a45f92d104c7-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.902568 4669 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 01 12:29:34 crc kubenswrapper[4669]: I1001 12:29:34.972348 4669 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 01 12:29:35 crc kubenswrapper[4669]: I1001 12:29:35.115996 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fce73f67-b429-4b4a-b873-a45f92d104c7","Type":"ContainerDied","Data":"85ceade58cb0c4740064a6b19afc7ce29bdeebb3747ae5e42f2e634a68ad6a73"} Oct 01 12:29:35 crc kubenswrapper[4669]: I1001 12:29:35.116489 4669 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="85ceade58cb0c4740064a6b19afc7ce29bdeebb3747ae5e42f2e634a68ad6a73" Oct 01 12:29:35 crc kubenswrapper[4669]: I1001 12:29:35.116137 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.578673 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 12:29:38 crc kubenswrapper[4669]: E1001 12:29:38.579494 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerName="extract-content" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.579526 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerName="extract-content" Oct 01 12:29:38 crc kubenswrapper[4669]: E1001 12:29:38.579561 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fce73f67-b429-4b4a-b873-a45f92d104c7" containerName="tempest-tests-tempest-tests-runner" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.579580 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce73f67-b429-4b4a-b873-a45f92d104c7" containerName="tempest-tests-tempest-tests-runner" Oct 01 12:29:38 crc kubenswrapper[4669]: E1001 12:29:38.579606 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerName="extract-utilities" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.579626 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerName="extract-utilities" Oct 01 12:29:38 crc kubenswrapper[4669]: E1001 12:29:38.579660 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerName="registry-server" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.579677 4669 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerName="registry-server" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.580208 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="fce73f67-b429-4b4a-b873-a45f92d104c7" containerName="tempest-tests-tempest-tests-runner" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.580256 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="621ceb7c-4687-4dbb-9fb6-ccfd3f2e828d" containerName="registry-server" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.581441 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.593330 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6zxgp" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.613708 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.667953 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45v8z\" (UniqueName: \"kubernetes.io/projected/b2e123bd-d4e4-4b23-a8e0-07ea01e2c586-kube-api-access-45v8z\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2e123bd-d4e4-4b23-a8e0-07ea01e2c586\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.668231 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2e123bd-d4e4-4b23-a8e0-07ea01e2c586\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 12:29:38 
crc kubenswrapper[4669]: I1001 12:29:38.770845 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45v8z\" (UniqueName: \"kubernetes.io/projected/b2e123bd-d4e4-4b23-a8e0-07ea01e2c586-kube-api-access-45v8z\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2e123bd-d4e4-4b23-a8e0-07ea01e2c586\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.771239 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2e123bd-d4e4-4b23-a8e0-07ea01e2c586\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.772159 4669 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2e123bd-d4e4-4b23-a8e0-07ea01e2c586\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.810662 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45v8z\" (UniqueName: \"kubernetes.io/projected/b2e123bd-d4e4-4b23-a8e0-07ea01e2c586-kube-api-access-45v8z\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2e123bd-d4e4-4b23-a8e0-07ea01e2c586\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.813404 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2e123bd-d4e4-4b23-a8e0-07ea01e2c586\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 12:29:38 crc kubenswrapper[4669]: I1001 12:29:38.919516 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 12:29:39 crc kubenswrapper[4669]: I1001 12:29:39.463325 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:29:39 crc kubenswrapper[4669]: I1001 12:29:39.464270 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 12:29:40 crc kubenswrapper[4669]: I1001 12:29:40.185609 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b2e123bd-d4e4-4b23-a8e0-07ea01e2c586","Type":"ContainerStarted","Data":"cc13c0dcc1fc3ab74a96301a086ed29080fcf29e158d159800691b56a57a6d89"} Oct 01 12:29:41 crc kubenswrapper[4669]: I1001 12:29:41.197934 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b2e123bd-d4e4-4b23-a8e0-07ea01e2c586","Type":"ContainerStarted","Data":"d69b22024e909840221b4ef078ec94e518f530f141de6b384c8bf02afe653eea"} Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.706110 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=18.814909876 podStartE2EDuration="19.706088315s" podCreationTimestamp="2025-10-01 12:29:38 +0000 UTC" firstStartedPulling="2025-10-01 12:29:39.463133771 +0000 UTC m=+3670.562698748" lastFinishedPulling="2025-10-01 12:29:40.35431221 +0000 UTC m=+3671.453877187" observedRunningTime="2025-10-01 12:29:41.221560797 +0000 UTC m=+3672.321125774" 
watchObservedRunningTime="2025-10-01 12:29:57.706088315 +0000 UTC m=+3688.805653302" Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.712598 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d4b5q/must-gather-4wb7b"] Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.714477 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d4b5q/must-gather-4wb7b" Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.716738 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d4b5q"/"kube-root-ca.crt" Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.717107 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-d4b5q"/"default-dockercfg-sqhxz" Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.717321 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d4b5q"/"openshift-service-ca.crt" Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.724874 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d4b5q/must-gather-4wb7b"] Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.854178 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/148ef6c2-2929-4c02-9f48-5f292bceba0c-must-gather-output\") pod \"must-gather-4wb7b\" (UID: \"148ef6c2-2929-4c02-9f48-5f292bceba0c\") " pod="openshift-must-gather-d4b5q/must-gather-4wb7b" Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.854223 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndzms\" (UniqueName: \"kubernetes.io/projected/148ef6c2-2929-4c02-9f48-5f292bceba0c-kube-api-access-ndzms\") pod \"must-gather-4wb7b\" (UID: \"148ef6c2-2929-4c02-9f48-5f292bceba0c\") " 
pod="openshift-must-gather-d4b5q/must-gather-4wb7b" Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.956508 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/148ef6c2-2929-4c02-9f48-5f292bceba0c-must-gather-output\") pod \"must-gather-4wb7b\" (UID: \"148ef6c2-2929-4c02-9f48-5f292bceba0c\") " pod="openshift-must-gather-d4b5q/must-gather-4wb7b" Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.956572 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndzms\" (UniqueName: \"kubernetes.io/projected/148ef6c2-2929-4c02-9f48-5f292bceba0c-kube-api-access-ndzms\") pod \"must-gather-4wb7b\" (UID: \"148ef6c2-2929-4c02-9f48-5f292bceba0c\") " pod="openshift-must-gather-d4b5q/must-gather-4wb7b" Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.957021 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/148ef6c2-2929-4c02-9f48-5f292bceba0c-must-gather-output\") pod \"must-gather-4wb7b\" (UID: \"148ef6c2-2929-4c02-9f48-5f292bceba0c\") " pod="openshift-must-gather-d4b5q/must-gather-4wb7b" Oct 01 12:29:57 crc kubenswrapper[4669]: I1001 12:29:57.981888 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndzms\" (UniqueName: \"kubernetes.io/projected/148ef6c2-2929-4c02-9f48-5f292bceba0c-kube-api-access-ndzms\") pod \"must-gather-4wb7b\" (UID: \"148ef6c2-2929-4c02-9f48-5f292bceba0c\") " pod="openshift-must-gather-d4b5q/must-gather-4wb7b" Oct 01 12:29:58 crc kubenswrapper[4669]: I1001 12:29:58.034674 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d4b5q/must-gather-4wb7b" Oct 01 12:29:58 crc kubenswrapper[4669]: I1001 12:29:58.544106 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d4b5q/must-gather-4wb7b"] Oct 01 12:29:58 crc kubenswrapper[4669]: W1001 12:29:58.559992 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod148ef6c2_2929_4c02_9f48_5f292bceba0c.slice/crio-d860b559f778dbaba765b0b777bbd6f980df068ae0db661d881c778e2ffa77c8 WatchSource:0}: Error finding container d860b559f778dbaba765b0b777bbd6f980df068ae0db661d881c778e2ffa77c8: Status 404 returned error can't find the container with id d860b559f778dbaba765b0b777bbd6f980df068ae0db661d881c778e2ffa77c8 Oct 01 12:29:59 crc kubenswrapper[4669]: I1001 12:29:59.452172 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/must-gather-4wb7b" event={"ID":"148ef6c2-2929-4c02-9f48-5f292bceba0c","Type":"ContainerStarted","Data":"d860b559f778dbaba765b0b777bbd6f980df068ae0db661d881c778e2ffa77c8"} Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.184732 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5"] Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.188328 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.191059 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.192199 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.200064 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5"] Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.309206 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d903d668-ceb6-47b8-bcea-ac1d35a3c750-secret-volume\") pod \"collect-profiles-29322030-z4bh5\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.309780 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shvng\" (UniqueName: \"kubernetes.io/projected/d903d668-ceb6-47b8-bcea-ac1d35a3c750-kube-api-access-shvng\") pod \"collect-profiles-29322030-z4bh5\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.309835 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d903d668-ceb6-47b8-bcea-ac1d35a3c750-config-volume\") pod \"collect-profiles-29322030-z4bh5\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.412597 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d903d668-ceb6-47b8-bcea-ac1d35a3c750-secret-volume\") pod \"collect-profiles-29322030-z4bh5\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.413012 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shvng\" (UniqueName: \"kubernetes.io/projected/d903d668-ceb6-47b8-bcea-ac1d35a3c750-kube-api-access-shvng\") pod \"collect-profiles-29322030-z4bh5\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.413054 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d903d668-ceb6-47b8-bcea-ac1d35a3c750-config-volume\") pod \"collect-profiles-29322030-z4bh5\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.414634 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d903d668-ceb6-47b8-bcea-ac1d35a3c750-config-volume\") pod \"collect-profiles-29322030-z4bh5\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.428322 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d903d668-ceb6-47b8-bcea-ac1d35a3c750-secret-volume\") pod \"collect-profiles-29322030-z4bh5\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.436670 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shvng\" (UniqueName: \"kubernetes.io/projected/d903d668-ceb6-47b8-bcea-ac1d35a3c750-kube-api-access-shvng\") pod \"collect-profiles-29322030-z4bh5\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:00 crc kubenswrapper[4669]: I1001 12:30:00.520160 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:01 crc kubenswrapper[4669]: I1001 12:30:01.027010 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5"] Oct 01 12:30:01 crc kubenswrapper[4669]: I1001 12:30:01.485219 4669 generic.go:334] "Generic (PLEG): container finished" podID="d903d668-ceb6-47b8-bcea-ac1d35a3c750" containerID="48f174442a0c1961a30e829f242ec6d65b446bd838a4d9b8d2abee11f6708465" exitCode=0 Oct 01 12:30:01 crc kubenswrapper[4669]: I1001 12:30:01.485646 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" event={"ID":"d903d668-ceb6-47b8-bcea-ac1d35a3c750","Type":"ContainerDied","Data":"48f174442a0c1961a30e829f242ec6d65b446bd838a4d9b8d2abee11f6708465"} Oct 01 12:30:01 crc kubenswrapper[4669]: I1001 12:30:01.485687 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" 
event={"ID":"d903d668-ceb6-47b8-bcea-ac1d35a3c750","Type":"ContainerStarted","Data":"2d218375286bb27e0618be338176f5a8009ae67a03c864b7fc0556e112198e39"} Oct 01 12:30:03 crc kubenswrapper[4669]: I1001 12:30:03.797830 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:03 crc kubenswrapper[4669]: I1001 12:30:03.951721 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d903d668-ceb6-47b8-bcea-ac1d35a3c750-config-volume\") pod \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " Oct 01 12:30:03 crc kubenswrapper[4669]: I1001 12:30:03.952312 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d903d668-ceb6-47b8-bcea-ac1d35a3c750-secret-volume\") pod \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " Oct 01 12:30:03 crc kubenswrapper[4669]: I1001 12:30:03.952372 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shvng\" (UniqueName: \"kubernetes.io/projected/d903d668-ceb6-47b8-bcea-ac1d35a3c750-kube-api-access-shvng\") pod \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\" (UID: \"d903d668-ceb6-47b8-bcea-ac1d35a3c750\") " Oct 01 12:30:03 crc kubenswrapper[4669]: I1001 12:30:03.953367 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d903d668-ceb6-47b8-bcea-ac1d35a3c750-config-volume" (OuterVolumeSpecName: "config-volume") pod "d903d668-ceb6-47b8-bcea-ac1d35a3c750" (UID: "d903d668-ceb6-47b8-bcea-ac1d35a3c750"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:30:03 crc kubenswrapper[4669]: I1001 12:30:03.967266 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d903d668-ceb6-47b8-bcea-ac1d35a3c750-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d903d668-ceb6-47b8-bcea-ac1d35a3c750" (UID: "d903d668-ceb6-47b8-bcea-ac1d35a3c750"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:30:03 crc kubenswrapper[4669]: I1001 12:30:03.968099 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d903d668-ceb6-47b8-bcea-ac1d35a3c750-kube-api-access-shvng" (OuterVolumeSpecName: "kube-api-access-shvng") pod "d903d668-ceb6-47b8-bcea-ac1d35a3c750" (UID: "d903d668-ceb6-47b8-bcea-ac1d35a3c750"). InnerVolumeSpecName "kube-api-access-shvng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:30:04 crc kubenswrapper[4669]: I1001 12:30:04.055266 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d903d668-ceb6-47b8-bcea-ac1d35a3c750-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:30:04 crc kubenswrapper[4669]: I1001 12:30:04.055306 4669 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d903d668-ceb6-47b8-bcea-ac1d35a3c750-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:30:04 crc kubenswrapper[4669]: I1001 12:30:04.055317 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shvng\" (UniqueName: \"kubernetes.io/projected/d903d668-ceb6-47b8-bcea-ac1d35a3c750-kube-api-access-shvng\") on node \"crc\" DevicePath \"\"" Oct 01 12:30:04 crc kubenswrapper[4669]: I1001 12:30:04.532211 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/must-gather-4wb7b" 
event={"ID":"148ef6c2-2929-4c02-9f48-5f292bceba0c","Type":"ContainerStarted","Data":"23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81"} Oct 01 12:30:04 crc kubenswrapper[4669]: I1001 12:30:04.532740 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/must-gather-4wb7b" event={"ID":"148ef6c2-2929-4c02-9f48-5f292bceba0c","Type":"ContainerStarted","Data":"76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de"} Oct 01 12:30:04 crc kubenswrapper[4669]: I1001 12:30:04.534999 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" event={"ID":"d903d668-ceb6-47b8-bcea-ac1d35a3c750","Type":"ContainerDied","Data":"2d218375286bb27e0618be338176f5a8009ae67a03c864b7fc0556e112198e39"} Oct 01 12:30:04 crc kubenswrapper[4669]: I1001 12:30:04.535033 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d218375286bb27e0618be338176f5a8009ae67a03c864b7fc0556e112198e39" Oct 01 12:30:04 crc kubenswrapper[4669]: I1001 12:30:04.535230 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-z4bh5" Oct 01 12:30:04 crc kubenswrapper[4669]: I1001 12:30:04.845559 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d4b5q/must-gather-4wb7b" podStartSLOduration=2.63343416 podStartE2EDuration="7.845531249s" podCreationTimestamp="2025-10-01 12:29:57 +0000 UTC" firstStartedPulling="2025-10-01 12:29:58.564450535 +0000 UTC m=+3689.664015522" lastFinishedPulling="2025-10-01 12:30:03.776547604 +0000 UTC m=+3694.876112611" observedRunningTime="2025-10-01 12:30:04.552723644 +0000 UTC m=+3695.652288621" watchObservedRunningTime="2025-10-01 12:30:04.845531249 +0000 UTC m=+3695.945096246" Oct 01 12:30:04 crc kubenswrapper[4669]: I1001 12:30:04.892394 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl"] Oct 01 12:30:04 crc kubenswrapper[4669]: I1001 12:30:04.918433 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321985-zsfwl"] Oct 01 12:30:05 crc kubenswrapper[4669]: I1001 12:30:05.658565 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58dcf4ef-bc6a-4b6b-a976-370b66cc762c" path="/var/lib/kubelet/pods/58dcf4ef-bc6a-4b6b-a976-370b66cc762c/volumes" Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.185843 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d4b5q/crc-debug-t9qqv"] Oct 01 12:30:08 crc kubenswrapper[4669]: E1001 12:30:08.188921 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d903d668-ceb6-47b8-bcea-ac1d35a3c750" containerName="collect-profiles" Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.188946 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="d903d668-ceb6-47b8-bcea-ac1d35a3c750" containerName="collect-profiles" Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.189303 
4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="d903d668-ceb6-47b8-bcea-ac1d35a3c750" containerName="collect-profiles" Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.190427 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.357806 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb586d3a-df66-4333-8e64-36038c49eee5-host\") pod \"crc-debug-t9qqv\" (UID: \"eb586d3a-df66-4333-8e64-36038c49eee5\") " pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.358193 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cjvr\" (UniqueName: \"kubernetes.io/projected/eb586d3a-df66-4333-8e64-36038c49eee5-kube-api-access-5cjvr\") pod \"crc-debug-t9qqv\" (UID: \"eb586d3a-df66-4333-8e64-36038c49eee5\") " pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.460633 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cjvr\" (UniqueName: \"kubernetes.io/projected/eb586d3a-df66-4333-8e64-36038c49eee5-kube-api-access-5cjvr\") pod \"crc-debug-t9qqv\" (UID: \"eb586d3a-df66-4333-8e64-36038c49eee5\") " pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.460840 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb586d3a-df66-4333-8e64-36038c49eee5-host\") pod \"crc-debug-t9qqv\" (UID: \"eb586d3a-df66-4333-8e64-36038c49eee5\") " pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.461039 4669 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb586d3a-df66-4333-8e64-36038c49eee5-host\") pod \"crc-debug-t9qqv\" (UID: \"eb586d3a-df66-4333-8e64-36038c49eee5\") " pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.481719 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cjvr\" (UniqueName: \"kubernetes.io/projected/eb586d3a-df66-4333-8e64-36038c49eee5-kube-api-access-5cjvr\") pod \"crc-debug-t9qqv\" (UID: \"eb586d3a-df66-4333-8e64-36038c49eee5\") " pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.514596 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" Oct 01 12:30:08 crc kubenswrapper[4669]: W1001 12:30:08.563904 4669 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb586d3a_df66_4333_8e64_36038c49eee5.slice/crio-755f7ce34b622db677f713ec68a74318e4a725bbbf95b42a75ff3d795e3c2c7f WatchSource:0}: Error finding container 755f7ce34b622db677f713ec68a74318e4a725bbbf95b42a75ff3d795e3c2c7f: Status 404 returned error can't find the container with id 755f7ce34b622db677f713ec68a74318e4a725bbbf95b42a75ff3d795e3c2c7f Oct 01 12:30:08 crc kubenswrapper[4669]: I1001 12:30:08.583555 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" event={"ID":"eb586d3a-df66-4333-8e64-36038c49eee5","Type":"ContainerStarted","Data":"755f7ce34b622db677f713ec68a74318e4a725bbbf95b42a75ff3d795e3c2c7f"} Oct 01 12:30:22 crc kubenswrapper[4669]: I1001 12:30:22.735122 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" event={"ID":"eb586d3a-df66-4333-8e64-36038c49eee5","Type":"ContainerStarted","Data":"1b51bf4ac61d05bfb17ea07656f0d5de53f5a199cb51eb1bf298a0ab3d74a6b4"} Oct 
01 12:30:22 crc kubenswrapper[4669]: I1001 12:30:22.754716 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" podStartSLOduration=0.969309091 podStartE2EDuration="14.75469149s" podCreationTimestamp="2025-10-01 12:30:08 +0000 UTC" firstStartedPulling="2025-10-01 12:30:08.566887087 +0000 UTC m=+3699.666452064" lastFinishedPulling="2025-10-01 12:30:22.352269496 +0000 UTC m=+3713.451834463" observedRunningTime="2025-10-01 12:30:22.751612685 +0000 UTC m=+3713.851177662" watchObservedRunningTime="2025-10-01 12:30:22.75469149 +0000 UTC m=+3713.854256467" Oct 01 12:30:36 crc kubenswrapper[4669]: I1001 12:30:36.838521 4669 scope.go:117] "RemoveContainer" containerID="2bb432865e250c0206120efca081e75961ca357d25d2529f5ae2670db2ef1c14" Oct 01 12:31:20 crc kubenswrapper[4669]: I1001 12:31:20.560579 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d474ff6b-jd7xf_90e4ab06-115b-4efa-9a11-d16218dec9e0/barbican-api/0.log" Oct 01 12:31:20 crc kubenswrapper[4669]: I1001 12:31:20.617039 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d474ff6b-jd7xf_90e4ab06-115b-4efa-9a11-d16218dec9e0/barbican-api-log/0.log" Oct 01 12:31:20 crc kubenswrapper[4669]: I1001 12:31:20.840638 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b7c87b994-mshrj_14df8713-8fa5-482c-9280-af169783618d/barbican-keystone-listener-log/0.log" Oct 01 12:31:20 crc kubenswrapper[4669]: I1001 12:31:20.863236 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b7c87b994-mshrj_14df8713-8fa5-482c-9280-af169783618d/barbican-keystone-listener/0.log" Oct 01 12:31:21 crc kubenswrapper[4669]: I1001 12:31:21.077636 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-84b6d46dff-gdp9m_c2f34b06-3e5b-4380-8b38-4c9be553dc00/barbican-worker-log/0.log" 
Oct 01 12:31:21 crc kubenswrapper[4669]: I1001 12:31:21.088152 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-84b6d46dff-gdp9m_c2f34b06-3e5b-4380-8b38-4c9be553dc00/barbican-worker/0.log" Oct 01 12:31:21 crc kubenswrapper[4669]: I1001 12:31:21.357176 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6_b905607b-b7ef-420f-8c4e-603d4c788186/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:22 crc kubenswrapper[4669]: I1001 12:31:22.093970 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272/ceilometer-central-agent/0.log" Oct 01 12:31:22 crc kubenswrapper[4669]: I1001 12:31:22.140318 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272/proxy-httpd/0.log" Oct 01 12:31:22 crc kubenswrapper[4669]: I1001 12:31:22.151558 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272/ceilometer-notification-agent/0.log" Oct 01 12:31:22 crc kubenswrapper[4669]: I1001 12:31:22.330941 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272/sg-core/0.log" Oct 01 12:31:22 crc kubenswrapper[4669]: I1001 12:31:22.445247 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0ad8d85d-0bac-4894-91c9-ad9cd6d485ad/cinder-api/0.log" Oct 01 12:31:22 crc kubenswrapper[4669]: I1001 12:31:22.585254 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0ad8d85d-0bac-4894-91c9-ad9cd6d485ad/cinder-api-log/0.log" Oct 01 12:31:22 crc kubenswrapper[4669]: I1001 12:31:22.727295 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7dc50b83-702d-4bf7-bee7-87ead33a1faa/cinder-scheduler/0.log" Oct 01 
12:31:22 crc kubenswrapper[4669]: I1001 12:31:22.847913 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7dc50b83-702d-4bf7-bee7-87ead33a1faa/probe/0.log" Oct 01 12:31:23 crc kubenswrapper[4669]: I1001 12:31:23.074403 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-knpjc_d753b30d-e1c5-45b9-8d78-767dd0cadaea/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:23 crc kubenswrapper[4669]: I1001 12:31:23.250781 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-m55t4_bee90766-2c6f-4f88-a17d-33098d6599a9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:23 crc kubenswrapper[4669]: I1001 12:31:23.384665 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-tpr99_667c6c9f-b26e-4edb-b3f7-5d7241afb839/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:23 crc kubenswrapper[4669]: I1001 12:31:23.542376 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-bkd88_9d9999e8-41a9-4930-b113-7f135640c123/init/0.log" Oct 01 12:31:23 crc kubenswrapper[4669]: I1001 12:31:23.713402 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-bkd88_9d9999e8-41a9-4930-b113-7f135640c123/init/0.log" Oct 01 12:31:23 crc kubenswrapper[4669]: I1001 12:31:23.742950 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-bkd88_9d9999e8-41a9-4930-b113-7f135640c123/dnsmasq-dns/0.log" Oct 01 12:31:23 crc kubenswrapper[4669]: I1001 12:31:23.805660 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rvw82_261f1c48-3c07-495d-b916-861c2a1943d8/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:23 crc kubenswrapper[4669]: I1001 12:31:23.991750 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0d4ea2b9-c6e4-4d27-866a-420be44d88f8/glance-httpd/0.log" Oct 01 12:31:24 crc kubenswrapper[4669]: I1001 12:31:24.109470 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0d4ea2b9-c6e4-4d27-866a-420be44d88f8/glance-log/0.log" Oct 01 12:31:24 crc kubenswrapper[4669]: I1001 12:31:24.213436 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0712d8cd-5673-4792-bafd-463179234f1d/glance-httpd/0.log" Oct 01 12:31:24 crc kubenswrapper[4669]: I1001 12:31:24.229407 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0712d8cd-5673-4792-bafd-463179234f1d/glance-log/0.log" Oct 01 12:31:24 crc kubenswrapper[4669]: I1001 12:31:24.414057 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-74d4dc5744-kqwsh_050a3c50-c6fb-4371-a309-af03e288d70d/horizon/0.log" Oct 01 12:31:24 crc kubenswrapper[4669]: I1001 12:31:24.659808 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2_bb0c4afd-aaf3-4875-94ec-668841ba1127/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:24 crc kubenswrapper[4669]: I1001 12:31:24.820856 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-blscr_b71b4047-5538-4132-9247-8b9b34e6979c/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:24 crc kubenswrapper[4669]: I1001 12:31:24.824237 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-74d4dc5744-kqwsh_050a3c50-c6fb-4371-a309-af03e288d70d/horizon-log/0.log" Oct 01 12:31:25 crc kubenswrapper[4669]: I1001 12:31:25.055356 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29322001-ljw4f_6de4821a-ded1-483f-ade1-dda52ecc46ed/keystone-cron/0.log" Oct 01 12:31:25 crc kubenswrapper[4669]: I1001 12:31:25.136505 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5d99769bb4-lq4fx_85b6fded-ed15-47f3-8e06-23511061f9b1/keystone-api/0.log" Oct 01 12:31:25 crc kubenswrapper[4669]: I1001 12:31:25.261507 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d4ec071d-763f-4513-8e0b-30fd6c1980d0/kube-state-metrics/0.log" Oct 01 12:31:25 crc kubenswrapper[4669]: I1001 12:31:25.345950 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc_9f57f089-5ea5-4b92-acbb-e14488a50253/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:25 crc kubenswrapper[4669]: I1001 12:31:25.801580 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-75fdb4d7c7-7ltfb_74d7e57e-eda0-4134-bfd3-ed2c0e4826bf/neutron-api/0.log" Oct 01 12:31:25 crc kubenswrapper[4669]: I1001 12:31:25.853137 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-75fdb4d7c7-7ltfb_74d7e57e-eda0-4134-bfd3-ed2c0e4826bf/neutron-httpd/0.log" Oct 01 12:31:26 crc kubenswrapper[4669]: I1001 12:31:26.094948 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm_09c6e280-6373-44f6-ad9b-fe24fe56e738/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:26 crc kubenswrapper[4669]: I1001 12:31:26.735934 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b39855ee-c66e-4f78-8128-a0149c9431da/nova-api-log/0.log" Oct 01 
12:31:26 crc kubenswrapper[4669]: I1001 12:31:26.864859 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_35e646b8-72fe-4762-a24b-a74ddfb6be97/nova-cell0-conductor-conductor/0.log" Oct 01 12:31:26 crc kubenswrapper[4669]: I1001 12:31:26.995755 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b39855ee-c66e-4f78-8128-a0149c9431da/nova-api-api/0.log" Oct 01 12:31:27 crc kubenswrapper[4669]: I1001 12:31:27.194110 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_2266ee85-7b31-496a-9dbd-6d69e282e847/nova-cell1-conductor-conductor/0.log" Oct 01 12:31:27 crc kubenswrapper[4669]: I1001 12:31:27.440045 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ef9631f5-92a1-4d2b-a5a6-25b60a609d61/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 12:31:27 crc kubenswrapper[4669]: I1001 12:31:27.672754 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-zr89n_da3d07f3-8fb0-4ab3-a350-ad5b2a09af97/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:27 crc kubenswrapper[4669]: I1001 12:31:27.833478 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_80be53d5-3338-467a-9be5-779722416d52/nova-metadata-log/0.log" Oct 01 12:31:28 crc kubenswrapper[4669]: I1001 12:31:28.479604 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_25ce3ac8-78ca-445e-acd1-995d99a5757a/nova-scheduler-scheduler/0.log" Oct 01 12:31:28 crc kubenswrapper[4669]: I1001 12:31:28.727527 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_92bd05a8-df03-4e85-b32a-dc3ced713159/mysql-bootstrap/0.log" Oct 01 12:31:28 crc kubenswrapper[4669]: I1001 12:31:28.926363 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_92bd05a8-df03-4e85-b32a-dc3ced713159/mysql-bootstrap/0.log" Oct 01 12:31:29 crc kubenswrapper[4669]: I1001 12:31:29.030667 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_92bd05a8-df03-4e85-b32a-dc3ced713159/galera/0.log" Oct 01 12:31:29 crc kubenswrapper[4669]: I1001 12:31:29.313835 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_872d79b4-0374-4e78-98e4-32393e2f7f05/mysql-bootstrap/0.log" Oct 01 12:31:29 crc kubenswrapper[4669]: I1001 12:31:29.515890 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_872d79b4-0374-4e78-98e4-32393e2f7f05/mysql-bootstrap/0.log" Oct 01 12:31:29 crc kubenswrapper[4669]: I1001 12:31:29.791892 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_80be53d5-3338-467a-9be5-779722416d52/nova-metadata-metadata/0.log" Oct 01 12:31:30 crc kubenswrapper[4669]: I1001 12:31:30.174380 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_872d79b4-0374-4e78-98e4-32393e2f7f05/galera/0.log" Oct 01 12:31:30 crc kubenswrapper[4669]: I1001 12:31:30.183558 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d68adea0-9ec1-4cc3-a727-a64457a70c9b/openstackclient/0.log" Oct 01 12:31:30 crc kubenswrapper[4669]: I1001 12:31:30.442721 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nsrfk_b77a4c9a-0426-40f6-a28a-7b985aebc4a2/openstack-network-exporter/0.log" Oct 01 12:31:30 crc kubenswrapper[4669]: I1001 12:31:30.705324 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d5fz7_1c9e9459-07b3-4f2d-9385-7c41a5bb6edd/ovsdb-server-init/0.log" Oct 01 12:31:30 crc kubenswrapper[4669]: I1001 12:31:30.920093 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-d5fz7_1c9e9459-07b3-4f2d-9385-7c41a5bb6edd/ovsdb-server-init/0.log" Oct 01 12:31:30 crc kubenswrapper[4669]: I1001 12:31:30.928807 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d5fz7_1c9e9459-07b3-4f2d-9385-7c41a5bb6edd/ovsdb-server/0.log" Oct 01 12:31:31 crc kubenswrapper[4669]: I1001 12:31:31.021680 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d5fz7_1c9e9459-07b3-4f2d-9385-7c41a5bb6edd/ovs-vswitchd/0.log" Oct 01 12:31:31 crc kubenswrapper[4669]: I1001 12:31:31.224119 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-plhdj_c5ffe639-af06-4c4c-8794-a1becff8a692/ovn-controller/0.log" Oct 01 12:31:31 crc kubenswrapper[4669]: I1001 12:31:31.481029 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xmvtv_ffe0bf53-0bbb-45ac-96b3-fa31c365470c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:31 crc kubenswrapper[4669]: I1001 12:31:31.614629 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_83f3ffe1-ac22-408f-ab82-73d5cfd82953/openstack-network-exporter/0.log" Oct 01 12:31:31 crc kubenswrapper[4669]: I1001 12:31:31.863319 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:31:31 crc kubenswrapper[4669]: I1001 12:31:31.863799 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 01 12:31:32 crc kubenswrapper[4669]: I1001 12:31:32.347899 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_83f3ffe1-ac22-408f-ab82-73d5cfd82953/ovn-northd/0.log" Oct 01 12:31:32 crc kubenswrapper[4669]: I1001 12:31:32.407315 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_76c8bfa8-2fca-4a74-85e8-f44af35d612f/openstack-network-exporter/0.log" Oct 01 12:31:32 crc kubenswrapper[4669]: I1001 12:31:32.607400 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_76c8bfa8-2fca-4a74-85e8-f44af35d612f/ovsdbserver-nb/0.log" Oct 01 12:31:32 crc kubenswrapper[4669]: I1001 12:31:32.716499 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1d13ad6e-a577-4f92-95ea-8ad268373774/openstack-network-exporter/0.log" Oct 01 12:31:32 crc kubenswrapper[4669]: I1001 12:31:32.883009 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1d13ad6e-a577-4f92-95ea-8ad268373774/ovsdbserver-sb/0.log" Oct 01 12:31:33 crc kubenswrapper[4669]: I1001 12:31:33.037536 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-795f7c5588-ppc46_419df7bd-f554-4888-8a51-e885964ada7e/placement-api/0.log" Oct 01 12:31:33 crc kubenswrapper[4669]: I1001 12:31:33.430271 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e/setup-container/0.log" Oct 01 12:31:33 crc kubenswrapper[4669]: I1001 12:31:33.465788 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-795f7c5588-ppc46_419df7bd-f554-4888-8a51-e885964ada7e/placement-log/0.log" Oct 01 12:31:33 crc kubenswrapper[4669]: I1001 12:31:33.867123 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e/setup-container/0.log" Oct 01 12:31:34 crc 
kubenswrapper[4669]: I1001 12:31:34.053184 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e/rabbitmq/0.log" Oct 01 12:31:34 crc kubenswrapper[4669]: I1001 12:31:34.172332 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_352c2b88-bf96-4858-b166-d5655b36b2b0/setup-container/0.log" Oct 01 12:31:34 crc kubenswrapper[4669]: I1001 12:31:34.557710 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_352c2b88-bf96-4858-b166-d5655b36b2b0/setup-container/0.log" Oct 01 12:31:34 crc kubenswrapper[4669]: I1001 12:31:34.588340 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_352c2b88-bf96-4858-b166-d5655b36b2b0/rabbitmq/0.log" Oct 01 12:31:34 crc kubenswrapper[4669]: I1001 12:31:34.858060 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf_266686ce-e77a-4c6f-83d3-4d417e9a819f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:34 crc kubenswrapper[4669]: I1001 12:31:34.919609 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-k47v2_a422f4f8-7b2e-4f73-89e8-2659cda6effa/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:35 crc kubenswrapper[4669]: I1001 12:31:35.237416 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8_3f131ccb-5e9b-4097-8abe-f10d6f2c9b52/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:35 crc kubenswrapper[4669]: I1001 12:31:35.433155 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-c4phj_0ffd3326-9422-4f07-b3e1-857324cff3e2/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:35 crc kubenswrapper[4669]: I1001 
12:31:35.522290 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-l66p2_7c88952b-368f-4527-8916-b4877e5af1e3/ssh-known-hosts-edpm-deployment/0.log" Oct 01 12:31:35 crc kubenswrapper[4669]: I1001 12:31:35.790370 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6c769b8b9-5svbp_fd677364-3064-4b42-9555-b640561fa4ed/proxy-server/0.log" Oct 01 12:31:35 crc kubenswrapper[4669]: I1001 12:31:35.940401 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6c769b8b9-5svbp_fd677364-3064-4b42-9555-b640561fa4ed/proxy-httpd/0.log" Oct 01 12:31:36 crc kubenswrapper[4669]: I1001 12:31:36.049269 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-pw6p8_9c77921b-54a6-48fd-a57c-4c14d17bf7d3/swift-ring-rebalance/0.log" Oct 01 12:31:36 crc kubenswrapper[4669]: I1001 12:31:36.217614 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/account-auditor/0.log" Oct 01 12:31:36 crc kubenswrapper[4669]: I1001 12:31:36.258626 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/account-reaper/0.log" Oct 01 12:31:36 crc kubenswrapper[4669]: I1001 12:31:36.460021 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/account-server/0.log" Oct 01 12:31:36 crc kubenswrapper[4669]: I1001 12:31:36.519385 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/account-replicator/0.log" Oct 01 12:31:36 crc kubenswrapper[4669]: I1001 12:31:36.537491 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/container-auditor/0.log" Oct 01 12:31:36 crc kubenswrapper[4669]: I1001 
12:31:36.716676 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/container-replicator/0.log" Oct 01 12:31:36 crc kubenswrapper[4669]: I1001 12:31:36.779316 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/container-updater/0.log" Oct 01 12:31:36 crc kubenswrapper[4669]: I1001 12:31:36.782378 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/container-server/0.log" Oct 01 12:31:36 crc kubenswrapper[4669]: I1001 12:31:36.939677 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/object-auditor/0.log" Oct 01 12:31:37 crc kubenswrapper[4669]: I1001 12:31:37.023422 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/object-expirer/0.log" Oct 01 12:31:37 crc kubenswrapper[4669]: I1001 12:31:37.048532 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/object-replicator/0.log" Oct 01 12:31:37 crc kubenswrapper[4669]: I1001 12:31:37.200361 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/object-server/0.log" Oct 01 12:31:37 crc kubenswrapper[4669]: I1001 12:31:37.243512 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/object-updater/0.log" Oct 01 12:31:37 crc kubenswrapper[4669]: I1001 12:31:37.282056 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/rsync/0.log" Oct 01 12:31:37 crc kubenswrapper[4669]: I1001 12:31:37.435985 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/swift-recon-cron/0.log" Oct 01 12:31:37 crc kubenswrapper[4669]: I1001 12:31:37.618606 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl_d1966594-3c43-4ecf-a982-fc851d0bb43b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:37 crc kubenswrapper[4669]: I1001 12:31:37.867251 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_fce73f67-b429-4b4a-b873-a45f92d104c7/tempest-tests-tempest-tests-runner/0.log" Oct 01 12:31:37 crc kubenswrapper[4669]: I1001 12:31:37.965697 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b2e123bd-d4e4-4b23-a8e0-07ea01e2c586/test-operator-logs-container/0.log" Oct 01 12:31:38 crc kubenswrapper[4669]: I1001 12:31:38.161818 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-krxsp_74c54aa8-261e-4bad-babf-2838c6b49114/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:31:44 crc kubenswrapper[4669]: I1001 12:31:44.691064 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0dda17c6-d274-4975-8796-deda5fd09e9c/memcached/0.log" Oct 01 12:32:01 crc kubenswrapper[4669]: I1001 12:32:01.863768 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:32:01 crc kubenswrapper[4669]: I1001 12:32:01.864677 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:32:31 crc kubenswrapper[4669]: I1001 12:32:31.863304 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:32:31 crc kubenswrapper[4669]: I1001 12:32:31.863928 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:32:31 crc kubenswrapper[4669]: I1001 12:32:31.863990 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 12:32:31 crc kubenswrapper[4669]: I1001 12:32:31.865128 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:32:31 crc kubenswrapper[4669]: I1001 12:32:31.865522 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" gracePeriod=600 Oct 01 12:32:31 crc kubenswrapper[4669]: E1001 
12:32:31.995412 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:32:32 crc kubenswrapper[4669]: I1001 12:32:32.229503 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" exitCode=0 Oct 01 12:32:32 crc kubenswrapper[4669]: I1001 12:32:32.229567 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c"} Oct 01 12:32:32 crc kubenswrapper[4669]: I1001 12:32:32.229616 4669 scope.go:117] "RemoveContainer" containerID="784c7f7fc27aa6a93a1fa55ebe85565db9e1e1b5c58371a518406bc62cba9814" Oct 01 12:32:32 crc kubenswrapper[4669]: I1001 12:32:32.230414 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:32:32 crc kubenswrapper[4669]: E1001 12:32:32.230691 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:32:34 crc kubenswrapper[4669]: I1001 12:32:34.256501 4669 generic.go:334] "Generic (PLEG): container 
finished" podID="eb586d3a-df66-4333-8e64-36038c49eee5" containerID="1b51bf4ac61d05bfb17ea07656f0d5de53f5a199cb51eb1bf298a0ab3d74a6b4" exitCode=0 Oct 01 12:32:34 crc kubenswrapper[4669]: I1001 12:32:34.256624 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" event={"ID":"eb586d3a-df66-4333-8e64-36038c49eee5","Type":"ContainerDied","Data":"1b51bf4ac61d05bfb17ea07656f0d5de53f5a199cb51eb1bf298a0ab3d74a6b4"} Oct 01 12:32:35 crc kubenswrapper[4669]: I1001 12:32:35.375842 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" Oct 01 12:32:35 crc kubenswrapper[4669]: I1001 12:32:35.424321 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d4b5q/crc-debug-t9qqv"] Oct 01 12:32:35 crc kubenswrapper[4669]: I1001 12:32:35.425251 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb586d3a-df66-4333-8e64-36038c49eee5-host\") pod \"eb586d3a-df66-4333-8e64-36038c49eee5\" (UID: \"eb586d3a-df66-4333-8e64-36038c49eee5\") " Oct 01 12:32:35 crc kubenswrapper[4669]: I1001 12:32:35.427071 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb586d3a-df66-4333-8e64-36038c49eee5-host" (OuterVolumeSpecName: "host") pod "eb586d3a-df66-4333-8e64-36038c49eee5" (UID: "eb586d3a-df66-4333-8e64-36038c49eee5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:32:35 crc kubenswrapper[4669]: I1001 12:32:35.428045 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb586d3a-df66-4333-8e64-36038c49eee5-host\") on node \"crc\" DevicePath \"\"" Oct 01 12:32:35 crc kubenswrapper[4669]: I1001 12:32:35.449435 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d4b5q/crc-debug-t9qqv"] Oct 01 12:32:35 crc kubenswrapper[4669]: I1001 12:32:35.530025 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cjvr\" (UniqueName: \"kubernetes.io/projected/eb586d3a-df66-4333-8e64-36038c49eee5-kube-api-access-5cjvr\") pod \"eb586d3a-df66-4333-8e64-36038c49eee5\" (UID: \"eb586d3a-df66-4333-8e64-36038c49eee5\") " Oct 01 12:32:35 crc kubenswrapper[4669]: I1001 12:32:35.537845 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb586d3a-df66-4333-8e64-36038c49eee5-kube-api-access-5cjvr" (OuterVolumeSpecName: "kube-api-access-5cjvr") pod "eb586d3a-df66-4333-8e64-36038c49eee5" (UID: "eb586d3a-df66-4333-8e64-36038c49eee5"). InnerVolumeSpecName "kube-api-access-5cjvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:32:35 crc kubenswrapper[4669]: I1001 12:32:35.634030 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cjvr\" (UniqueName: \"kubernetes.io/projected/eb586d3a-df66-4333-8e64-36038c49eee5-kube-api-access-5cjvr\") on node \"crc\" DevicePath \"\"" Oct 01 12:32:35 crc kubenswrapper[4669]: I1001 12:32:35.664928 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb586d3a-df66-4333-8e64-36038c49eee5" path="/var/lib/kubelet/pods/eb586d3a-df66-4333-8e64-36038c49eee5/volumes" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.277961 4669 scope.go:117] "RemoveContainer" containerID="1b51bf4ac61d05bfb17ea07656f0d5de53f5a199cb51eb1bf298a0ab3d74a6b4" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.278105 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-t9qqv" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.654159 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d4b5q/crc-debug-zqr8v"] Oct 01 12:32:36 crc kubenswrapper[4669]: E1001 12:32:36.654774 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb586d3a-df66-4333-8e64-36038c49eee5" containerName="container-00" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.654802 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb586d3a-df66-4333-8e64-36038c49eee5" containerName="container-00" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.655104 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb586d3a-df66-4333-8e64-36038c49eee5" containerName="container-00" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.656061 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.756718 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-host\") pod \"crc-debug-zqr8v\" (UID: \"71681a5e-2e04-46c8-a0f8-fa52c2ae2624\") " pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.756793 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4c4r\" (UniqueName: \"kubernetes.io/projected/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-kube-api-access-w4c4r\") pod \"crc-debug-zqr8v\" (UID: \"71681a5e-2e04-46c8-a0f8-fa52c2ae2624\") " pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.858977 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-host\") pod \"crc-debug-zqr8v\" (UID: \"71681a5e-2e04-46c8-a0f8-fa52c2ae2624\") " pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.859066 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4c4r\" (UniqueName: \"kubernetes.io/projected/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-kube-api-access-w4c4r\") pod \"crc-debug-zqr8v\" (UID: \"71681a5e-2e04-46c8-a0f8-fa52c2ae2624\") " pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.859225 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-host\") pod \"crc-debug-zqr8v\" (UID: \"71681a5e-2e04-46c8-a0f8-fa52c2ae2624\") " pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" Oct 01 12:32:36 crc 
kubenswrapper[4669]: I1001 12:32:36.887262 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4c4r\" (UniqueName: \"kubernetes.io/projected/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-kube-api-access-w4c4r\") pod \"crc-debug-zqr8v\" (UID: \"71681a5e-2e04-46c8-a0f8-fa52c2ae2624\") " pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" Oct 01 12:32:36 crc kubenswrapper[4669]: I1001 12:32:36.977671 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" Oct 01 12:32:37 crc kubenswrapper[4669]: I1001 12:32:37.326547 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" event={"ID":"71681a5e-2e04-46c8-a0f8-fa52c2ae2624","Type":"ContainerStarted","Data":"ad68806eae3e6f36666b239f6eefd4283bac0a5eecdd1ca5335b2ba8737f926f"} Oct 01 12:32:37 crc kubenswrapper[4669]: I1001 12:32:37.326608 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" event={"ID":"71681a5e-2e04-46c8-a0f8-fa52c2ae2624","Type":"ContainerStarted","Data":"d9e9d9af2b337d8296011ed8d749b746c9690340817aa657c17bdce7d5d1330d"} Oct 01 12:32:37 crc kubenswrapper[4669]: I1001 12:32:37.347510 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" podStartSLOduration=1.347483532 podStartE2EDuration="1.347483532s" podCreationTimestamp="2025-10-01 12:32:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:32:37.344310805 +0000 UTC m=+3848.443875792" watchObservedRunningTime="2025-10-01 12:32:37.347483532 +0000 UTC m=+3848.447048509" Oct 01 12:32:38 crc kubenswrapper[4669]: E1001 12:32:38.055852 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71681a5e_2e04_46c8_a0f8_fa52c2ae2624.slice/crio-conmon-ad68806eae3e6f36666b239f6eefd4283bac0a5eecdd1ca5335b2ba8737f926f.scope\": RecentStats: unable to find data in memory cache]" Oct 01 12:32:38 crc kubenswrapper[4669]: I1001 12:32:38.337199 4669 generic.go:334] "Generic (PLEG): container finished" podID="71681a5e-2e04-46c8-a0f8-fa52c2ae2624" containerID="ad68806eae3e6f36666b239f6eefd4283bac0a5eecdd1ca5335b2ba8737f926f" exitCode=0 Oct 01 12:32:38 crc kubenswrapper[4669]: I1001 12:32:38.337255 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" event={"ID":"71681a5e-2e04-46c8-a0f8-fa52c2ae2624","Type":"ContainerDied","Data":"ad68806eae3e6f36666b239f6eefd4283bac0a5eecdd1ca5335b2ba8737f926f"} Oct 01 12:32:39 crc kubenswrapper[4669]: I1001 12:32:39.458945 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" Oct 01 12:32:39 crc kubenswrapper[4669]: I1001 12:32:39.609166 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4c4r\" (UniqueName: \"kubernetes.io/projected/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-kube-api-access-w4c4r\") pod \"71681a5e-2e04-46c8-a0f8-fa52c2ae2624\" (UID: \"71681a5e-2e04-46c8-a0f8-fa52c2ae2624\") " Oct 01 12:32:39 crc kubenswrapper[4669]: I1001 12:32:39.609293 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-host\") pod \"71681a5e-2e04-46c8-a0f8-fa52c2ae2624\" (UID: \"71681a5e-2e04-46c8-a0f8-fa52c2ae2624\") " Oct 01 12:32:39 crc kubenswrapper[4669]: I1001 12:32:39.610310 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-host" (OuterVolumeSpecName: "host") pod "71681a5e-2e04-46c8-a0f8-fa52c2ae2624" 
(UID: "71681a5e-2e04-46c8-a0f8-fa52c2ae2624"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:32:39 crc kubenswrapper[4669]: I1001 12:32:39.617009 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-kube-api-access-w4c4r" (OuterVolumeSpecName: "kube-api-access-w4c4r") pod "71681a5e-2e04-46c8-a0f8-fa52c2ae2624" (UID: "71681a5e-2e04-46c8-a0f8-fa52c2ae2624"). InnerVolumeSpecName "kube-api-access-w4c4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:32:39 crc kubenswrapper[4669]: I1001 12:32:39.712524 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4c4r\" (UniqueName: \"kubernetes.io/projected/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-kube-api-access-w4c4r\") on node \"crc\" DevicePath \"\"" Oct 01 12:32:39 crc kubenswrapper[4669]: I1001 12:32:39.713160 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71681a5e-2e04-46c8-a0f8-fa52c2ae2624-host\") on node \"crc\" DevicePath \"\"" Oct 01 12:32:40 crc kubenswrapper[4669]: I1001 12:32:40.365249 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" event={"ID":"71681a5e-2e04-46c8-a0f8-fa52c2ae2624","Type":"ContainerDied","Data":"d9e9d9af2b337d8296011ed8d749b746c9690340817aa657c17bdce7d5d1330d"} Oct 01 12:32:40 crc kubenswrapper[4669]: I1001 12:32:40.365305 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9e9d9af2b337d8296011ed8d749b746c9690340817aa657c17bdce7d5d1330d" Oct 01 12:32:40 crc kubenswrapper[4669]: I1001 12:32:40.365342 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-zqr8v" Oct 01 12:32:45 crc kubenswrapper[4669]: I1001 12:32:45.248761 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d4b5q/crc-debug-zqr8v"] Oct 01 12:32:45 crc kubenswrapper[4669]: I1001 12:32:45.257462 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d4b5q/crc-debug-zqr8v"] Oct 01 12:32:45 crc kubenswrapper[4669]: I1001 12:32:45.658788 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71681a5e-2e04-46c8-a0f8-fa52c2ae2624" path="/var/lib/kubelet/pods/71681a5e-2e04-46c8-a0f8-fa52c2ae2624/volumes" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.464000 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d4b5q/crc-debug-6rvpn"] Oct 01 12:32:46 crc kubenswrapper[4669]: E1001 12:32:46.464595 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71681a5e-2e04-46c8-a0f8-fa52c2ae2624" containerName="container-00" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.464613 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="71681a5e-2e04-46c8-a0f8-fa52c2ae2624" containerName="container-00" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.464926 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="71681a5e-2e04-46c8-a0f8-fa52c2ae2624" containerName="container-00" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.465733 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.573225 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-host\") pod \"crc-debug-6rvpn\" (UID: \"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5\") " pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.573401 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8b57\" (UniqueName: \"kubernetes.io/projected/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-kube-api-access-x8b57\") pod \"crc-debug-6rvpn\" (UID: \"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5\") " pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.645113 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:32:46 crc kubenswrapper[4669]: E1001 12:32:46.645561 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.675487 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-host\") pod \"crc-debug-6rvpn\" (UID: \"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5\") " pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.675577 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x8b57\" (UniqueName: \"kubernetes.io/projected/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-kube-api-access-x8b57\") pod \"crc-debug-6rvpn\" (UID: \"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5\") " pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.675764 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-host\") pod \"crc-debug-6rvpn\" (UID: \"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5\") " pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.699774 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8b57\" (UniqueName: \"kubernetes.io/projected/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-kube-api-access-x8b57\") pod \"crc-debug-6rvpn\" (UID: \"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5\") " pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" Oct 01 12:32:46 crc kubenswrapper[4669]: I1001 12:32:46.789186 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" Oct 01 12:32:47 crc kubenswrapper[4669]: I1001 12:32:47.444658 4669 generic.go:334] "Generic (PLEG): container finished" podID="496f1da0-d3ee-407e-b4d3-2d7cebaf67a5" containerID="4a0b3d36d33744510d1bd86101926481d242eb238be4aa3e9c2b23b1f5266fdf" exitCode=0 Oct 01 12:32:47 crc kubenswrapper[4669]: I1001 12:32:47.445242 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" event={"ID":"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5","Type":"ContainerDied","Data":"4a0b3d36d33744510d1bd86101926481d242eb238be4aa3e9c2b23b1f5266fdf"} Oct 01 12:32:47 crc kubenswrapper[4669]: I1001 12:32:47.445283 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" event={"ID":"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5","Type":"ContainerStarted","Data":"e6d159188e5343d04ba7a3c4ef133b6e30427e9772d1ad57f448ded78510dfb7"} Oct 01 12:32:47 crc kubenswrapper[4669]: I1001 12:32:47.492724 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d4b5q/crc-debug-6rvpn"] Oct 01 12:32:47 crc kubenswrapper[4669]: I1001 12:32:47.505177 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d4b5q/crc-debug-6rvpn"] Oct 01 12:32:48 crc kubenswrapper[4669]: I1001 12:32:48.575301 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" Oct 01 12:32:48 crc kubenswrapper[4669]: I1001 12:32:48.621913 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-host\") pod \"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5\" (UID: \"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5\") " Oct 01 12:32:48 crc kubenswrapper[4669]: I1001 12:32:48.622425 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8b57\" (UniqueName: \"kubernetes.io/projected/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-kube-api-access-x8b57\") pod \"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5\" (UID: \"496f1da0-d3ee-407e-b4d3-2d7cebaf67a5\") " Oct 01 12:32:48 crc kubenswrapper[4669]: I1001 12:32:48.622450 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-host" (OuterVolumeSpecName: "host") pod "496f1da0-d3ee-407e-b4d3-2d7cebaf67a5" (UID: "496f1da0-d3ee-407e-b4d3-2d7cebaf67a5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:32:48 crc kubenswrapper[4669]: I1001 12:32:48.623277 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-host\") on node \"crc\" DevicePath \"\"" Oct 01 12:32:48 crc kubenswrapper[4669]: I1001 12:32:48.646538 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-kube-api-access-x8b57" (OuterVolumeSpecName: "kube-api-access-x8b57") pod "496f1da0-d3ee-407e-b4d3-2d7cebaf67a5" (UID: "496f1da0-d3ee-407e-b4d3-2d7cebaf67a5"). InnerVolumeSpecName "kube-api-access-x8b57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:32:48 crc kubenswrapper[4669]: I1001 12:32:48.726589 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8b57\" (UniqueName: \"kubernetes.io/projected/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5-kube-api-access-x8b57\") on node \"crc\" DevicePath \"\"" Oct 01 12:32:49 crc kubenswrapper[4669]: I1001 12:32:49.382610 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/util/0.log" Oct 01 12:32:49 crc kubenswrapper[4669]: I1001 12:32:49.464367 4669 scope.go:117] "RemoveContainer" containerID="4a0b3d36d33744510d1bd86101926481d242eb238be4aa3e9c2b23b1f5266fdf" Oct 01 12:32:49 crc kubenswrapper[4669]: I1001 12:32:49.464446 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d4b5q/crc-debug-6rvpn" Oct 01 12:32:49 crc kubenswrapper[4669]: I1001 12:32:49.652435 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/util/0.log" Oct 01 12:32:49 crc kubenswrapper[4669]: I1001 12:32:49.654292 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/pull/0.log" Oct 01 12:32:49 crc kubenswrapper[4669]: I1001 12:32:49.656161 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496f1da0-d3ee-407e-b4d3-2d7cebaf67a5" path="/var/lib/kubelet/pods/496f1da0-d3ee-407e-b4d3-2d7cebaf67a5/volumes" Oct 01 12:32:49 crc kubenswrapper[4669]: I1001 12:32:49.671472 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/pull/0.log" Oct 01 12:32:49 crc kubenswrapper[4669]: I1001 12:32:49.860607 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/util/0.log" Oct 01 12:32:49 crc kubenswrapper[4669]: I1001 12:32:49.870838 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/pull/0.log" Oct 01 12:32:49 crc kubenswrapper[4669]: I1001 12:32:49.903667 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/extract/0.log" Oct 01 12:32:50 crc kubenswrapper[4669]: I1001 12:32:50.078869 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-9m4xq_aba4ff11-8110-4490-8a20-74c454be55d8/kube-rbac-proxy/0.log" Oct 01 12:32:50 crc kubenswrapper[4669]: I1001 12:32:50.192969 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-9m4xq_aba4ff11-8110-4490-8a20-74c454be55d8/manager/0.log" Oct 01 12:32:50 crc kubenswrapper[4669]: I1001 12:32:50.218929 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-cz2dp_7f0d56cb-1002-4345-903e-7e5979f47978/kube-rbac-proxy/0.log" Oct 01 12:32:50 crc kubenswrapper[4669]: I1001 12:32:50.358501 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-cz2dp_7f0d56cb-1002-4345-903e-7e5979f47978/manager/0.log" Oct 01 12:32:50 crc 
kubenswrapper[4669]: I1001 12:32:50.382710 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-xwt57_4fa32a0a-904e-4b37-8ffb-a8c1d89df689/kube-rbac-proxy/0.log" Oct 01 12:32:50 crc kubenswrapper[4669]: I1001 12:32:50.426254 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-xwt57_4fa32a0a-904e-4b37-8ffb-a8c1d89df689/manager/0.log" Oct 01 12:32:50 crc kubenswrapper[4669]: I1001 12:32:50.642464 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-dhqk6_e8163ded-d297-43ea-bde7-b5b90bdf1d17/kube-rbac-proxy/0.log" Oct 01 12:32:50 crc kubenswrapper[4669]: I1001 12:32:50.742286 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-dhqk6_e8163ded-d297-43ea-bde7-b5b90bdf1d17/manager/0.log" Oct 01 12:32:50 crc kubenswrapper[4669]: I1001 12:32:50.881436 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-2z84q_ebc9c519-e267-43d1-93b7-4cf38c84cc66/manager/0.log" Oct 01 12:32:50 crc kubenswrapper[4669]: I1001 12:32:50.918134 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-2z84q_ebc9c519-e267-43d1-93b7-4cf38c84cc66/kube-rbac-proxy/0.log" Oct 01 12:32:50 crc kubenswrapper[4669]: I1001 12:32:50.985852 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-5fxfq_1265856e-7658-44ca-b0a9-a0a5a42b8f5d/kube-rbac-proxy/0.log" Oct 01 12:32:51 crc kubenswrapper[4669]: I1001 12:32:51.178326 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-dv8s2_a887d629-1025-4da7-8c68-4b17c7205479/kube-rbac-proxy/0.log" Oct 01 12:32:51 crc kubenswrapper[4669]: I1001 12:32:51.202546 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-5fxfq_1265856e-7658-44ca-b0a9-a0a5a42b8f5d/manager/0.log" Oct 01 12:32:51 crc kubenswrapper[4669]: I1001 12:32:51.555662 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-dv8s2_a887d629-1025-4da7-8c68-4b17c7205479/manager/0.log" Oct 01 12:32:51 crc kubenswrapper[4669]: I1001 12:32:51.630198 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-lbq2b_863b3375-804f-4c8b-ba14-01230d822604/kube-rbac-proxy/0.log" Oct 01 12:32:51 crc kubenswrapper[4669]: I1001 12:32:51.712820 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-lbq2b_863b3375-804f-4c8b-ba14-01230d822604/manager/0.log" Oct 01 12:32:51 crc kubenswrapper[4669]: I1001 12:32:51.851452 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-j8t6g_400a027c-2dab-48e5-a109-e7b64d35807a/kube-rbac-proxy/0.log" Oct 01 12:32:51 crc kubenswrapper[4669]: I1001 12:32:51.977841 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-j8t6g_400a027c-2dab-48e5-a109-e7b64d35807a/manager/0.log" Oct 01 12:32:52 crc kubenswrapper[4669]: I1001 12:32:52.092183 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-djnfk_99c2ea9b-bcc7-4933-9614-94c32861e93c/kube-rbac-proxy/0.log" Oct 01 12:32:52 crc kubenswrapper[4669]: I1001 12:32:52.189528 
4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-djnfk_99c2ea9b-bcc7-4933-9614-94c32861e93c/manager/0.log" Oct 01 12:32:52 crc kubenswrapper[4669]: I1001 12:32:52.247009 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-fssdp_08897606-8ccd-4508-bf20-501855920e9e/kube-rbac-proxy/0.log" Oct 01 12:32:52 crc kubenswrapper[4669]: I1001 12:32:52.342996 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-fssdp_08897606-8ccd-4508-bf20-501855920e9e/manager/0.log" Oct 01 12:32:52 crc kubenswrapper[4669]: I1001 12:32:52.487556 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-86p66_da724701-02fc-439b-ba86-52bde8cb3003/kube-rbac-proxy/0.log" Oct 01 12:32:52 crc kubenswrapper[4669]: I1001 12:32:52.538251 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-86p66_da724701-02fc-439b-ba86-52bde8cb3003/manager/0.log" Oct 01 12:32:52 crc kubenswrapper[4669]: I1001 12:32:52.705286 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-7qf7f_8783a088-91c6-4f3c-bc34-b3d5a805ea07/kube-rbac-proxy/0.log" Oct 01 12:32:52 crc kubenswrapper[4669]: I1001 12:32:52.773121 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-7qf7f_8783a088-91c6-4f3c-bc34-b3d5a805ea07/manager/0.log" Oct 01 12:32:52 crc kubenswrapper[4669]: I1001 12:32:52.876175 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-nr258_707270cf-007e-4572-bae9-dd6b4c6e50d3/kube-rbac-proxy/0.log" Oct 01 12:32:52 crc 
kubenswrapper[4669]: I1001 12:32:52.960600 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-nr258_707270cf-007e-4572-bae9-dd6b4c6e50d3/manager/0.log" Oct 01 12:32:53 crc kubenswrapper[4669]: I1001 12:32:53.028969 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cfvgks_91df1fb9-8c91-4dde-9317-ff09df368c49/kube-rbac-proxy/0.log" Oct 01 12:32:53 crc kubenswrapper[4669]: I1001 12:32:53.087390 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cfvgks_91df1fb9-8c91-4dde-9317-ff09df368c49/manager/0.log" Oct 01 12:32:53 crc kubenswrapper[4669]: I1001 12:32:53.203741 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6599487588-n9gx7_964d3ab1-839a-49e6-b7c8-46056b070131/kube-rbac-proxy/0.log" Oct 01 12:32:53 crc kubenswrapper[4669]: I1001 12:32:53.456551 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-76995989df-q6c9d_7fe95054-218b-47f4-a729-95a7f6b45a3d/kube-rbac-proxy/0.log" Oct 01 12:32:53 crc kubenswrapper[4669]: I1001 12:32:53.695943 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-76995989df-q6c9d_7fe95054-218b-47f4-a729-95a7f6b45a3d/operator/0.log" Oct 01 12:32:53 crc kubenswrapper[4669]: I1001 12:32:53.733443 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fmzck_684a045b-062b-4989-85cf-f621d5c88f39/registry-server/0.log" Oct 01 12:32:53 crc kubenswrapper[4669]: I1001 12:32:53.809173 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-5mbbk_621748e9-0765-432f-bbc9-9bb62594eff6/kube-rbac-proxy/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.009654 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-bc8gx_4f573e37-cb0a-4eba-9477-7c3d71276c86/kube-rbac-proxy/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.164835 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-5mbbk_621748e9-0765-432f-bbc9-9bb62594eff6/manager/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.176562 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-bc8gx_4f573e37-cb0a-4eba-9477-7c3d71276c86/manager/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.309255 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-szrsc_2545705c-a102-47ca-b42b-119670c5be57/operator/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.394138 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6599487588-n9gx7_964d3ab1-839a-49e6-b7c8-46056b070131/manager/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.475468 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-2c2qw_e24ede8f-da24-4161-8621-d8b5abd08c1f/kube-rbac-proxy/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.503644 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-2c2qw_e24ede8f-da24-4161-8621-d8b5abd08c1f/manager/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.683042 4669 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-p5qll_a2282a94-4700-4aae-8572-2104962decf8/manager/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.701360 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-p5qll_a2282a94-4700-4aae-8572-2104962decf8/kube-rbac-proxy/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.753316 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-vxwz5_fb18dab5-d638-443a-bb62-6508de79bc0f/kube-rbac-proxy/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.883037 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-vxwz5_fb18dab5-d638-443a-bb62-6508de79bc0f/manager/0.log" Oct 01 12:32:54 crc kubenswrapper[4669]: I1001 12:32:54.911163 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-lgckz_6d2b6087-c54d-4138-b162-e024a7a0e842/kube-rbac-proxy/0.log" Oct 01 12:32:55 crc kubenswrapper[4669]: I1001 12:32:55.043672 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-lgckz_6d2b6087-c54d-4138-b162-e024a7a0e842/manager/0.log" Oct 01 12:32:57 crc kubenswrapper[4669]: I1001 12:32:57.644708 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:32:57 crc kubenswrapper[4669]: E1001 12:32:57.645729 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:33:12 crc kubenswrapper[4669]: I1001 12:33:12.643889 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:33:12 crc kubenswrapper[4669]: E1001 12:33:12.644796 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:33:12 crc kubenswrapper[4669]: I1001 12:33:12.967715 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jfjm9_83f83da4-e855-4070-b524-4b7b789d0215/control-plane-machine-set-operator/0.log" Oct 01 12:33:13 crc kubenswrapper[4669]: I1001 12:33:13.124780 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7q7n5_e7445657-b8e4-4974-a680-7a05f0628fb7/kube-rbac-proxy/0.log" Oct 01 12:33:13 crc kubenswrapper[4669]: I1001 12:33:13.172449 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7q7n5_e7445657-b8e4-4974-a680-7a05f0628fb7/machine-api-operator/0.log" Oct 01 12:33:25 crc kubenswrapper[4669]: I1001 12:33:25.644716 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:33:25 crc kubenswrapper[4669]: E1001 12:33:25.646350 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:33:26 crc kubenswrapper[4669]: I1001 12:33:26.593482 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-vczt8_2c0929fd-88f7-47d4-9975-54d4d6c606c0/cert-manager-controller/0.log" Oct 01 12:33:26 crc kubenswrapper[4669]: I1001 12:33:26.771182 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-6kfl9_8f212951-fc37-4759-8933-2cee5f94845e/cert-manager-cainjector/0.log" Oct 01 12:33:26 crc kubenswrapper[4669]: I1001 12:33:26.857038 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-j5v46_bb59959e-c15d-466f-8809-66c2ae4c8a0b/cert-manager-webhook/0.log" Oct 01 12:33:37 crc kubenswrapper[4669]: I1001 12:33:37.644827 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:33:37 crc kubenswrapper[4669]: E1001 12:33:37.645892 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:33:40 crc kubenswrapper[4669]: I1001 12:33:40.136777 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-wlrv2_39594755-e0c6-4941-ac5c-b847a32459ff/nmstate-console-plugin/0.log" Oct 01 12:33:40 crc kubenswrapper[4669]: I1001 12:33:40.338272 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8p9bl_a2c1c01f-82d8-48e3-a140-14f363594918/nmstate-handler/0.log" Oct 01 12:33:40 crc kubenswrapper[4669]: I1001 12:33:40.373504 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-87fqs_10471e2d-ad87-44b7-af2e-b2209ae9337e/kube-rbac-proxy/0.log" Oct 01 12:33:40 crc kubenswrapper[4669]: I1001 12:33:40.472282 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-87fqs_10471e2d-ad87-44b7-af2e-b2209ae9337e/nmstate-metrics/0.log" Oct 01 12:33:40 crc kubenswrapper[4669]: I1001 12:33:40.603311 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-t4r7r_d776bb0e-3c68-4273-8aa2-e17ce4299e0c/nmstate-operator/0.log" Oct 01 12:33:40 crc kubenswrapper[4669]: I1001 12:33:40.673201 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-vbw79_4a99a9fe-0aaa-496b-97f2-e0964378b735/nmstate-webhook/0.log" Oct 01 12:33:49 crc kubenswrapper[4669]: I1001 12:33:49.651264 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:33:49 crc kubenswrapper[4669]: E1001 12:33:49.652847 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" 
Oct 01 12:33:56 crc kubenswrapper[4669]: I1001 12:33:56.112273 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-8kstm_05969ea4-e97c-4b66-aa70-c4909a58472b/kube-rbac-proxy/0.log" Oct 01 12:33:56 crc kubenswrapper[4669]: I1001 12:33:56.230071 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-8kstm_05969ea4-e97c-4b66-aa70-c4909a58472b/controller/0.log" Oct 01 12:33:56 crc kubenswrapper[4669]: I1001 12:33:56.363695 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-frr-files/0.log" Oct 01 12:33:56 crc kubenswrapper[4669]: I1001 12:33:56.586912 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-reloader/0.log" Oct 01 12:33:56 crc kubenswrapper[4669]: I1001 12:33:56.593879 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-frr-files/0.log" Oct 01 12:33:56 crc kubenswrapper[4669]: I1001 12:33:56.673239 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-metrics/0.log" Oct 01 12:33:56 crc kubenswrapper[4669]: I1001 12:33:56.701228 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-reloader/0.log" Oct 01 12:33:56 crc kubenswrapper[4669]: I1001 12:33:56.851974 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-frr-files/0.log" Oct 01 12:33:56 crc kubenswrapper[4669]: I1001 12:33:56.888324 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-reloader/0.log" Oct 01 12:33:56 crc kubenswrapper[4669]: I1001 
12:33:56.918765 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-metrics/0.log" Oct 01 12:33:56 crc kubenswrapper[4669]: I1001 12:33:56.924958 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-metrics/0.log" Oct 01 12:33:57 crc kubenswrapper[4669]: I1001 12:33:57.134909 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-reloader/0.log" Oct 01 12:33:57 crc kubenswrapper[4669]: I1001 12:33:57.172783 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-frr-files/0.log" Oct 01 12:33:57 crc kubenswrapper[4669]: I1001 12:33:57.211378 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-metrics/0.log" Oct 01 12:33:57 crc kubenswrapper[4669]: I1001 12:33:57.216530 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/controller/0.log" Oct 01 12:33:57 crc kubenswrapper[4669]: I1001 12:33:57.457552 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/frr-metrics/0.log" Oct 01 12:33:57 crc kubenswrapper[4669]: I1001 12:33:57.466194 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/kube-rbac-proxy/0.log" Oct 01 12:33:57 crc kubenswrapper[4669]: I1001 12:33:57.500040 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/kube-rbac-proxy-frr/0.log" Oct 01 12:33:57 crc kubenswrapper[4669]: I1001 12:33:57.742143 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-pnqmb_b38f6785-4644-476d-9014-3ad44957a952/frr-k8s-webhook-server/0.log" Oct 01 12:33:57 crc kubenswrapper[4669]: I1001 12:33:57.753336 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/reloader/0.log" Oct 01 12:33:58 crc kubenswrapper[4669]: I1001 12:33:58.098577 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6774cc6d74-d656r_562d1f16-7779-4cfb-ae80-5bad719475d1/manager/0.log" Oct 01 12:33:58 crc kubenswrapper[4669]: I1001 12:33:58.265464 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7c796f5894-wqh8w_c8793365-44bd-4d00-aa95-2d23bd134f23/webhook-server/0.log" Oct 01 12:33:58 crc kubenswrapper[4669]: I1001 12:33:58.375102 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-t6f9w_fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230/kube-rbac-proxy/0.log" Oct 01 12:33:58 crc kubenswrapper[4669]: I1001 12:33:58.936995 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/frr/0.log" Oct 01 12:33:59 crc kubenswrapper[4669]: I1001 12:33:59.083452 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-t6f9w_fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230/speaker/0.log" Oct 01 12:34:04 crc kubenswrapper[4669]: I1001 12:34:04.645243 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:34:04 crc kubenswrapper[4669]: E1001 12:34:04.646349 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:34:12 crc kubenswrapper[4669]: I1001 12:34:12.653494 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/util/0.log" Oct 01 12:34:12 crc kubenswrapper[4669]: I1001 12:34:12.879871 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/pull/0.log" Oct 01 12:34:12 crc kubenswrapper[4669]: I1001 12:34:12.895952 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/util/0.log" Oct 01 12:34:12 crc kubenswrapper[4669]: I1001 12:34:12.897497 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/pull/0.log" Oct 01 12:34:13 crc kubenswrapper[4669]: I1001 12:34:13.135401 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/util/0.log" Oct 01 12:34:13 crc kubenswrapper[4669]: I1001 12:34:13.144344 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/extract/0.log" Oct 01 12:34:13 crc kubenswrapper[4669]: I1001 12:34:13.145381 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/pull/0.log" Oct 01 12:34:13 crc kubenswrapper[4669]: I1001 12:34:13.313929 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-utilities/0.log" Oct 01 12:34:13 crc kubenswrapper[4669]: I1001 12:34:13.507831 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-content/0.log" Oct 01 12:34:13 crc kubenswrapper[4669]: I1001 12:34:13.519791 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-utilities/0.log" Oct 01 12:34:13 crc kubenswrapper[4669]: I1001 12:34:13.568763 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-content/0.log" Oct 01 12:34:13 crc kubenswrapper[4669]: I1001 12:34:13.753319 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-utilities/0.log" Oct 01 12:34:13 crc kubenswrapper[4669]: I1001 12:34:13.753465 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-content/0.log" Oct 01 12:34:14 crc kubenswrapper[4669]: I1001 12:34:14.038113 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-utilities/0.log" Oct 01 12:34:14 crc kubenswrapper[4669]: I1001 12:34:14.166448 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-content/0.log" Oct 01 12:34:14 crc kubenswrapper[4669]: I1001 12:34:14.199203 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/registry-server/0.log" Oct 01 12:34:14 crc kubenswrapper[4669]: I1001 12:34:14.218522 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-utilities/0.log" Oct 01 12:34:14 crc kubenswrapper[4669]: I1001 12:34:14.277568 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-content/0.log" Oct 01 12:34:14 crc kubenswrapper[4669]: I1001 12:34:14.489603 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-utilities/0.log" Oct 01 12:34:14 crc kubenswrapper[4669]: I1001 12:34:14.551110 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-content/0.log" Oct 01 12:34:14 crc kubenswrapper[4669]: I1001 12:34:14.808289 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/util/0.log" Oct 01 12:34:14 crc kubenswrapper[4669]: I1001 12:34:14.982555 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/util/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.044095 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/pull/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.073749 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/pull/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.089309 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/registry-server/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.283715 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/util/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.305561 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/pull/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.309034 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/extract/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.502525 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-hg5t5_7693f22a-6758-4b18-8161-c5eb5e27a395/marketplace-operator/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.514477 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-utilities/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.645507 4669 
scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:34:15 crc kubenswrapper[4669]: E1001 12:34:15.645882 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.770333 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-utilities/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.810959 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-content/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.813120 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-content/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.965835 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-utilities/0.log" Oct 01 12:34:15 crc kubenswrapper[4669]: I1001 12:34:15.991647 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-content/0.log" Oct 01 12:34:16 crc kubenswrapper[4669]: I1001 12:34:16.155446 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/registry-server/0.log" Oct 01 12:34:16 crc kubenswrapper[4669]: I1001 12:34:16.195783 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-utilities/0.log" Oct 01 12:34:16 crc kubenswrapper[4669]: I1001 12:34:16.334950 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-utilities/0.log" Oct 01 12:34:16 crc kubenswrapper[4669]: I1001 12:34:16.368791 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-content/0.log" Oct 01 12:34:16 crc kubenswrapper[4669]: I1001 12:34:16.384594 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-content/0.log" Oct 01 12:34:16 crc kubenswrapper[4669]: I1001 12:34:16.525026 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-utilities/0.log" Oct 01 12:34:16 crc kubenswrapper[4669]: I1001 12:34:16.574242 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-content/0.log" Oct 01 12:34:17 crc kubenswrapper[4669]: I1001 12:34:17.103187 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/registry-server/0.log" Oct 01 12:34:28 crc kubenswrapper[4669]: I1001 12:34:28.644504 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:34:28 crc kubenswrapper[4669]: E1001 12:34:28.645807 
4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:34:40 crc kubenswrapper[4669]: I1001 12:34:40.644792 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:34:40 crc kubenswrapper[4669]: E1001 12:34:40.646150 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:34:53 crc kubenswrapper[4669]: I1001 12:34:53.645348 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:34:53 crc kubenswrapper[4669]: E1001 12:34:53.646674 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:35:06 crc kubenswrapper[4669]: I1001 12:35:06.644857 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:35:06 crc kubenswrapper[4669]: E1001 
12:35:06.647516 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:35:21 crc kubenswrapper[4669]: I1001 12:35:21.644848 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:35:21 crc kubenswrapper[4669]: E1001 12:35:21.646248 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.794472 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n7zht"] Oct 01 12:35:32 crc kubenswrapper[4669]: E1001 12:35:32.796113 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496f1da0-d3ee-407e-b4d3-2d7cebaf67a5" containerName="container-00" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.796133 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="496f1da0-d3ee-407e-b4d3-2d7cebaf67a5" containerName="container-00" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.796404 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="496f1da0-d3ee-407e-b4d3-2d7cebaf67a5" containerName="container-00" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.807129 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.820500 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7zht"] Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.885917 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-utilities\") pod \"redhat-marketplace-n7zht\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.886002 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8n79\" (UniqueName: \"kubernetes.io/projected/4ce12572-bdb3-435f-8b75-8f08f68a00fc-kube-api-access-d8n79\") pod \"redhat-marketplace-n7zht\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.886247 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-catalog-content\") pod \"redhat-marketplace-n7zht\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.988876 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-catalog-content\") pod \"redhat-marketplace-n7zht\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.988957 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-utilities\") pod \"redhat-marketplace-n7zht\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.988990 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8n79\" (UniqueName: \"kubernetes.io/projected/4ce12572-bdb3-435f-8b75-8f08f68a00fc-kube-api-access-d8n79\") pod \"redhat-marketplace-n7zht\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.989612 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-catalog-content\") pod \"redhat-marketplace-n7zht\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:32 crc kubenswrapper[4669]: I1001 12:35:32.989645 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-utilities\") pod \"redhat-marketplace-n7zht\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:33 crc kubenswrapper[4669]: I1001 12:35:33.040646 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8n79\" (UniqueName: \"kubernetes.io/projected/4ce12572-bdb3-435f-8b75-8f08f68a00fc-kube-api-access-d8n79\") pod \"redhat-marketplace-n7zht\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:33 crc kubenswrapper[4669]: I1001 12:35:33.134204 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:33 crc kubenswrapper[4669]: I1001 12:35:33.649521 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:35:33 crc kubenswrapper[4669]: E1001 12:35:33.650357 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:35:33 crc kubenswrapper[4669]: I1001 12:35:33.872315 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7zht"] Oct 01 12:35:34 crc kubenswrapper[4669]: I1001 12:35:34.289994 4669 generic.go:334] "Generic (PLEG): container finished" podID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" containerID="27ee8c8d697071449dac83fe220c16241e89ece8576f4a6596a2dfbc855fa0cb" exitCode=0 Oct 01 12:35:34 crc kubenswrapper[4669]: I1001 12:35:34.290071 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7zht" event={"ID":"4ce12572-bdb3-435f-8b75-8f08f68a00fc","Type":"ContainerDied","Data":"27ee8c8d697071449dac83fe220c16241e89ece8576f4a6596a2dfbc855fa0cb"} Oct 01 12:35:34 crc kubenswrapper[4669]: I1001 12:35:34.290956 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7zht" event={"ID":"4ce12572-bdb3-435f-8b75-8f08f68a00fc","Type":"ContainerStarted","Data":"7faea7df0e8acdfd22b661cdb6e8d0569598933f5b23df9d88a6d5fb1479fc17"} Oct 01 12:35:34 crc kubenswrapper[4669]: I1001 12:35:34.295035 4669 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 
12:35:36 crc kubenswrapper[4669]: I1001 12:35:36.320425 4669 generic.go:334] "Generic (PLEG): container finished" podID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" containerID="fb3bf4588063bf1f9d5391adf469fbadfe146b6543c886d482c58b6d4822391b" exitCode=0 Oct 01 12:35:36 crc kubenswrapper[4669]: I1001 12:35:36.320548 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7zht" event={"ID":"4ce12572-bdb3-435f-8b75-8f08f68a00fc","Type":"ContainerDied","Data":"fb3bf4588063bf1f9d5391adf469fbadfe146b6543c886d482c58b6d4822391b"} Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.335280 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7zht" event={"ID":"4ce12572-bdb3-435f-8b75-8f08f68a00fc","Type":"ContainerStarted","Data":"92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866"} Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.366369 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n7zht" podStartSLOduration=2.830082766 podStartE2EDuration="5.366114857s" podCreationTimestamp="2025-10-01 12:35:32 +0000 UTC" firstStartedPulling="2025-10-01 12:35:34.294711326 +0000 UTC m=+4025.394276303" lastFinishedPulling="2025-10-01 12:35:36.830743417 +0000 UTC m=+4027.930308394" observedRunningTime="2025-10-01 12:35:37.355113389 +0000 UTC m=+4028.454678376" watchObservedRunningTime="2025-10-01 12:35:37.366114857 +0000 UTC m=+4028.465679834" Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.580875 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wr8pt"] Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.583282 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.605316 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wr8pt"] Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.654351 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrv6n\" (UniqueName: \"kubernetes.io/projected/21b105b8-b6f0-4288-ba52-7de7cfb204eb-kube-api-access-vrv6n\") pod \"certified-operators-wr8pt\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.654683 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-catalog-content\") pod \"certified-operators-wr8pt\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.654755 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-utilities\") pod \"certified-operators-wr8pt\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.756450 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-utilities\") pod \"certified-operators-wr8pt\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.756596 4669 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vrv6n\" (UniqueName: \"kubernetes.io/projected/21b105b8-b6f0-4288-ba52-7de7cfb204eb-kube-api-access-vrv6n\") pod \"certified-operators-wr8pt\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.756724 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-catalog-content\") pod \"certified-operators-wr8pt\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.757109 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-utilities\") pod \"certified-operators-wr8pt\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.757358 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-catalog-content\") pod \"certified-operators-wr8pt\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.783030 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrv6n\" (UniqueName: \"kubernetes.io/projected/21b105b8-b6f0-4288-ba52-7de7cfb204eb-kube-api-access-vrv6n\") pod \"certified-operators-wr8pt\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:37 crc kubenswrapper[4669]: I1001 12:35:37.908949 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:38 crc kubenswrapper[4669]: I1001 12:35:38.475401 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wr8pt"] Oct 01 12:35:39 crc kubenswrapper[4669]: I1001 12:35:39.369297 4669 generic.go:334] "Generic (PLEG): container finished" podID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" containerID="dd1fdcaaf7407ceec931a3471ab1ab02b876a16c6fbf21af06fa964d22e5a1bd" exitCode=0 Oct 01 12:35:39 crc kubenswrapper[4669]: I1001 12:35:39.369797 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wr8pt" event={"ID":"21b105b8-b6f0-4288-ba52-7de7cfb204eb","Type":"ContainerDied","Data":"dd1fdcaaf7407ceec931a3471ab1ab02b876a16c6fbf21af06fa964d22e5a1bd"} Oct 01 12:35:39 crc kubenswrapper[4669]: I1001 12:35:39.369833 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wr8pt" event={"ID":"21b105b8-b6f0-4288-ba52-7de7cfb204eb","Type":"ContainerStarted","Data":"64b808d997a06c3d32f1645700003b0bf3874d5e9758d30d180040b2c6478516"} Oct 01 12:35:41 crc kubenswrapper[4669]: I1001 12:35:41.395736 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wr8pt" event={"ID":"21b105b8-b6f0-4288-ba52-7de7cfb204eb","Type":"ContainerStarted","Data":"05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7"} Oct 01 12:35:42 crc kubenswrapper[4669]: I1001 12:35:42.417274 4669 generic.go:334] "Generic (PLEG): container finished" podID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" containerID="05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7" exitCode=0 Oct 01 12:35:42 crc kubenswrapper[4669]: I1001 12:35:42.418518 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wr8pt" 
event={"ID":"21b105b8-b6f0-4288-ba52-7de7cfb204eb","Type":"ContainerDied","Data":"05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7"} Oct 01 12:35:42 crc kubenswrapper[4669]: I1001 12:35:42.419657 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wr8pt" event={"ID":"21b105b8-b6f0-4288-ba52-7de7cfb204eb","Type":"ContainerStarted","Data":"1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560"} Oct 01 12:35:42 crc kubenswrapper[4669]: I1001 12:35:42.446538 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wr8pt" podStartSLOduration=2.728407607 podStartE2EDuration="5.446500529s" podCreationTimestamp="2025-10-01 12:35:37 +0000 UTC" firstStartedPulling="2025-10-01 12:35:39.377221851 +0000 UTC m=+4030.476786828" lastFinishedPulling="2025-10-01 12:35:42.095314773 +0000 UTC m=+4033.194879750" observedRunningTime="2025-10-01 12:35:42.435708847 +0000 UTC m=+4033.535273824" watchObservedRunningTime="2025-10-01 12:35:42.446500529 +0000 UTC m=+4033.546065506" Oct 01 12:35:43 crc kubenswrapper[4669]: I1001 12:35:43.145594 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:43 crc kubenswrapper[4669]: I1001 12:35:43.146167 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:43 crc kubenswrapper[4669]: I1001 12:35:43.222617 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:43 crc kubenswrapper[4669]: I1001 12:35:43.501830 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:44 crc kubenswrapper[4669]: I1001 12:35:44.355383 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-n7zht"] Oct 01 12:35:44 crc kubenswrapper[4669]: I1001 12:35:44.644339 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:35:44 crc kubenswrapper[4669]: E1001 12:35:44.644726 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:35:45 crc kubenswrapper[4669]: I1001 12:35:45.456412 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n7zht" podUID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" containerName="registry-server" containerID="cri-o://92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866" gracePeriod=2 Oct 01 12:35:45 crc kubenswrapper[4669]: I1001 12:35:45.964999 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.095265 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8n79\" (UniqueName: \"kubernetes.io/projected/4ce12572-bdb3-435f-8b75-8f08f68a00fc-kube-api-access-d8n79\") pod \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.095498 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-catalog-content\") pod \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.095615 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-utilities\") pod \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\" (UID: \"4ce12572-bdb3-435f-8b75-8f08f68a00fc\") " Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.097256 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-utilities" (OuterVolumeSpecName: "utilities") pod "4ce12572-bdb3-435f-8b75-8f08f68a00fc" (UID: "4ce12572-bdb3-435f-8b75-8f08f68a00fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.124527 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ce12572-bdb3-435f-8b75-8f08f68a00fc" (UID: "4ce12572-bdb3-435f-8b75-8f08f68a00fc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.202550 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.202605 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce12572-bdb3-435f-8b75-8f08f68a00fc-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.481173 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7zht" event={"ID":"4ce12572-bdb3-435f-8b75-8f08f68a00fc","Type":"ContainerDied","Data":"92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866"} Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.481286 4669 scope.go:117] "RemoveContainer" containerID="92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.481196 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7zht" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.481135 4669 generic.go:334] "Generic (PLEG): container finished" podID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" containerID="92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866" exitCode=0 Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.481528 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7zht" event={"ID":"4ce12572-bdb3-435f-8b75-8f08f68a00fc","Type":"ContainerDied","Data":"7faea7df0e8acdfd22b661cdb6e8d0569598933f5b23df9d88a6d5fb1479fc17"} Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.513143 4669 scope.go:117] "RemoveContainer" containerID="fb3bf4588063bf1f9d5391adf469fbadfe146b6543c886d482c58b6d4822391b" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.829272 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce12572-bdb3-435f-8b75-8f08f68a00fc-kube-api-access-d8n79" (OuterVolumeSpecName: "kube-api-access-d8n79") pod "4ce12572-bdb3-435f-8b75-8f08f68a00fc" (UID: "4ce12572-bdb3-435f-8b75-8f08f68a00fc"). InnerVolumeSpecName "kube-api-access-d8n79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.861237 4669 scope.go:117] "RemoveContainer" containerID="27ee8c8d697071449dac83fe220c16241e89ece8576f4a6596a2dfbc855fa0cb" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.920692 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8n79\" (UniqueName: \"kubernetes.io/projected/4ce12572-bdb3-435f-8b75-8f08f68a00fc-kube-api-access-d8n79\") on node \"crc\" DevicePath \"\"" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.956930 4669 scope.go:117] "RemoveContainer" containerID="92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866" Oct 01 12:35:46 crc kubenswrapper[4669]: E1001 12:35:46.958113 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866\": container with ID starting with 92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866 not found: ID does not exist" containerID="92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.958182 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866"} err="failed to get container status \"92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866\": rpc error: code = NotFound desc = could not find container \"92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866\": container with ID starting with 92ae5c8aecdefc24e6f7940de297476e3d457de71c9f10be1d8afbe29506c866 not found: ID does not exist" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.958230 4669 scope.go:117] "RemoveContainer" containerID="fb3bf4588063bf1f9d5391adf469fbadfe146b6543c886d482c58b6d4822391b" Oct 01 12:35:46 crc kubenswrapper[4669]: E1001 12:35:46.959657 
4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb3bf4588063bf1f9d5391adf469fbadfe146b6543c886d482c58b6d4822391b\": container with ID starting with fb3bf4588063bf1f9d5391adf469fbadfe146b6543c886d482c58b6d4822391b not found: ID does not exist" containerID="fb3bf4588063bf1f9d5391adf469fbadfe146b6543c886d482c58b6d4822391b" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.959737 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3bf4588063bf1f9d5391adf469fbadfe146b6543c886d482c58b6d4822391b"} err="failed to get container status \"fb3bf4588063bf1f9d5391adf469fbadfe146b6543c886d482c58b6d4822391b\": rpc error: code = NotFound desc = could not find container \"fb3bf4588063bf1f9d5391adf469fbadfe146b6543c886d482c58b6d4822391b\": container with ID starting with fb3bf4588063bf1f9d5391adf469fbadfe146b6543c886d482c58b6d4822391b not found: ID does not exist" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.959787 4669 scope.go:117] "RemoveContainer" containerID="27ee8c8d697071449dac83fe220c16241e89ece8576f4a6596a2dfbc855fa0cb" Oct 01 12:35:46 crc kubenswrapper[4669]: E1001 12:35:46.960411 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ee8c8d697071449dac83fe220c16241e89ece8576f4a6596a2dfbc855fa0cb\": container with ID starting with 27ee8c8d697071449dac83fe220c16241e89ece8576f4a6596a2dfbc855fa0cb not found: ID does not exist" containerID="27ee8c8d697071449dac83fe220c16241e89ece8576f4a6596a2dfbc855fa0cb" Oct 01 12:35:46 crc kubenswrapper[4669]: I1001 12:35:46.960497 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ee8c8d697071449dac83fe220c16241e89ece8576f4a6596a2dfbc855fa0cb"} err="failed to get container status \"27ee8c8d697071449dac83fe220c16241e89ece8576f4a6596a2dfbc855fa0cb\": rpc error: code = 
NotFound desc = could not find container \"27ee8c8d697071449dac83fe220c16241e89ece8576f4a6596a2dfbc855fa0cb\": container with ID starting with 27ee8c8d697071449dac83fe220c16241e89ece8576f4a6596a2dfbc855fa0cb not found: ID does not exist" Oct 01 12:35:47 crc kubenswrapper[4669]: I1001 12:35:47.146719 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7zht"] Oct 01 12:35:47 crc kubenswrapper[4669]: I1001 12:35:47.158170 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7zht"] Oct 01 12:35:47 crc kubenswrapper[4669]: I1001 12:35:47.661850 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" path="/var/lib/kubelet/pods/4ce12572-bdb3-435f-8b75-8f08f68a00fc/volumes" Oct 01 12:35:47 crc kubenswrapper[4669]: I1001 12:35:47.909769 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:47 crc kubenswrapper[4669]: I1001 12:35:47.912225 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:48 crc kubenswrapper[4669]: I1001 12:35:48.009468 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:48 crc kubenswrapper[4669]: I1001 12:35:48.563971 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:49 crc kubenswrapper[4669]: I1001 12:35:49.753897 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wr8pt"] Oct 01 12:35:50 crc kubenswrapper[4669]: I1001 12:35:50.541664 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wr8pt" 
podUID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" containerName="registry-server" containerID="cri-o://1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560" gracePeriod=2 Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.062910 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.141891 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-catalog-content\") pod \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.142269 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-utilities\") pod \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.142315 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrv6n\" (UniqueName: \"kubernetes.io/projected/21b105b8-b6f0-4288-ba52-7de7cfb204eb-kube-api-access-vrv6n\") pod \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\" (UID: \"21b105b8-b6f0-4288-ba52-7de7cfb204eb\") " Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.143844 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-utilities" (OuterVolumeSpecName: "utilities") pod "21b105b8-b6f0-4288-ba52-7de7cfb204eb" (UID: "21b105b8-b6f0-4288-ba52-7de7cfb204eb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.152613 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b105b8-b6f0-4288-ba52-7de7cfb204eb-kube-api-access-vrv6n" (OuterVolumeSpecName: "kube-api-access-vrv6n") pod "21b105b8-b6f0-4288-ba52-7de7cfb204eb" (UID: "21b105b8-b6f0-4288-ba52-7de7cfb204eb"). InnerVolumeSpecName "kube-api-access-vrv6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.159023 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.159071 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrv6n\" (UniqueName: \"kubernetes.io/projected/21b105b8-b6f0-4288-ba52-7de7cfb204eb-kube-api-access-vrv6n\") on node \"crc\" DevicePath \"\"" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.191703 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21b105b8-b6f0-4288-ba52-7de7cfb204eb" (UID: "21b105b8-b6f0-4288-ba52-7de7cfb204eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.262772 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b105b8-b6f0-4288-ba52-7de7cfb204eb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.561630 4669 generic.go:334] "Generic (PLEG): container finished" podID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" containerID="1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560" exitCode=0 Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.561708 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wr8pt" event={"ID":"21b105b8-b6f0-4288-ba52-7de7cfb204eb","Type":"ContainerDied","Data":"1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560"} Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.561767 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wr8pt" event={"ID":"21b105b8-b6f0-4288-ba52-7de7cfb204eb","Type":"ContainerDied","Data":"64b808d997a06c3d32f1645700003b0bf3874d5e9758d30d180040b2c6478516"} Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.561808 4669 scope.go:117] "RemoveContainer" containerID="1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.562158 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wr8pt" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.614457 4669 scope.go:117] "RemoveContainer" containerID="05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.621165 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wr8pt"] Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.630649 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wr8pt"] Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.658442 4669 scope.go:117] "RemoveContainer" containerID="dd1fdcaaf7407ceec931a3471ab1ab02b876a16c6fbf21af06fa964d22e5a1bd" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.666294 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" path="/var/lib/kubelet/pods/21b105b8-b6f0-4288-ba52-7de7cfb204eb/volumes" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.689961 4669 scope.go:117] "RemoveContainer" containerID="1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560" Oct 01 12:35:51 crc kubenswrapper[4669]: E1001 12:35:51.690542 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560\": container with ID starting with 1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560 not found: ID does not exist" containerID="1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.690582 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560"} err="failed to get container status 
\"1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560\": rpc error: code = NotFound desc = could not find container \"1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560\": container with ID starting with 1ab92aaafd762e0ccee047d698cf98d199e28363e8916e37317f0c21befd7560 not found: ID does not exist" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.690611 4669 scope.go:117] "RemoveContainer" containerID="05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7" Oct 01 12:35:51 crc kubenswrapper[4669]: E1001 12:35:51.690948 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7\": container with ID starting with 05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7 not found: ID does not exist" containerID="05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.691007 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7"} err="failed to get container status \"05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7\": rpc error: code = NotFound desc = could not find container \"05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7\": container with ID starting with 05d8484e469258dcf7fab14b6a0081d4dbc9f877dc342de20fb4c880e1ca45e7 not found: ID does not exist" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.691057 4669 scope.go:117] "RemoveContainer" containerID="dd1fdcaaf7407ceec931a3471ab1ab02b876a16c6fbf21af06fa964d22e5a1bd" Oct 01 12:35:51 crc kubenswrapper[4669]: E1001 12:35:51.691683 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dd1fdcaaf7407ceec931a3471ab1ab02b876a16c6fbf21af06fa964d22e5a1bd\": container with ID starting with dd1fdcaaf7407ceec931a3471ab1ab02b876a16c6fbf21af06fa964d22e5a1bd not found: ID does not exist" containerID="dd1fdcaaf7407ceec931a3471ab1ab02b876a16c6fbf21af06fa964d22e5a1bd" Oct 01 12:35:51 crc kubenswrapper[4669]: I1001 12:35:51.691713 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1fdcaaf7407ceec931a3471ab1ab02b876a16c6fbf21af06fa964d22e5a1bd"} err="failed to get container status \"dd1fdcaaf7407ceec931a3471ab1ab02b876a16c6fbf21af06fa964d22e5a1bd\": rpc error: code = NotFound desc = could not find container \"dd1fdcaaf7407ceec931a3471ab1ab02b876a16c6fbf21af06fa964d22e5a1bd\": container with ID starting with dd1fdcaaf7407ceec931a3471ab1ab02b876a16c6fbf21af06fa964d22e5a1bd not found: ID does not exist" Oct 01 12:35:56 crc kubenswrapper[4669]: I1001 12:35:56.644292 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:35:56 crc kubenswrapper[4669]: E1001 12:35:56.645425 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:36:08 crc kubenswrapper[4669]: I1001 12:36:08.644162 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:36:08 crc kubenswrapper[4669]: E1001 12:36:08.645572 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:36:20 crc kubenswrapper[4669]: I1001 12:36:20.645152 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:36:20 crc kubenswrapper[4669]: E1001 12:36:20.646233 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:36:34 crc kubenswrapper[4669]: I1001 12:36:34.073733 4669 generic.go:334] "Generic (PLEG): container finished" podID="148ef6c2-2929-4c02-9f48-5f292bceba0c" containerID="76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de" exitCode=0 Oct 01 12:36:34 crc kubenswrapper[4669]: I1001 12:36:34.073807 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d4b5q/must-gather-4wb7b" event={"ID":"148ef6c2-2929-4c02-9f48-5f292bceba0c","Type":"ContainerDied","Data":"76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de"} Oct 01 12:36:34 crc kubenswrapper[4669]: I1001 12:36:34.075570 4669 scope.go:117] "RemoveContainer" containerID="76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de" Oct 01 12:36:34 crc kubenswrapper[4669]: I1001 12:36:34.644938 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:36:34 crc kubenswrapper[4669]: E1001 12:36:34.645491 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:36:34 crc kubenswrapper[4669]: I1001 12:36:34.921674 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d4b5q_must-gather-4wb7b_148ef6c2-2929-4c02-9f48-5f292bceba0c/gather/0.log" Oct 01 12:36:43 crc kubenswrapper[4669]: I1001 12:36:43.595803 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d4b5q/must-gather-4wb7b"] Oct 01 12:36:43 crc kubenswrapper[4669]: I1001 12:36:43.596862 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-d4b5q/must-gather-4wb7b" podUID="148ef6c2-2929-4c02-9f48-5f292bceba0c" containerName="copy" containerID="cri-o://23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81" gracePeriod=2 Oct 01 12:36:43 crc kubenswrapper[4669]: I1001 12:36:43.604793 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d4b5q/must-gather-4wb7b"] Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.066053 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d4b5q_must-gather-4wb7b_148ef6c2-2929-4c02-9f48-5f292bceba0c/copy/0.log" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.067836 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d4b5q/must-gather-4wb7b" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.189403 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d4b5q_must-gather-4wb7b_148ef6c2-2929-4c02-9f48-5f292bceba0c/copy/0.log" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.189959 4669 generic.go:334] "Generic (PLEG): container finished" podID="148ef6c2-2929-4c02-9f48-5f292bceba0c" containerID="23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81" exitCode=143 Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.190061 4669 scope.go:117] "RemoveContainer" containerID="23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.190140 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d4b5q/must-gather-4wb7b" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.219330 4669 scope.go:117] "RemoveContainer" containerID="76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.235271 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/148ef6c2-2929-4c02-9f48-5f292bceba0c-must-gather-output\") pod \"148ef6c2-2929-4c02-9f48-5f292bceba0c\" (UID: \"148ef6c2-2929-4c02-9f48-5f292bceba0c\") " Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.235528 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndzms\" (UniqueName: \"kubernetes.io/projected/148ef6c2-2929-4c02-9f48-5f292bceba0c-kube-api-access-ndzms\") pod \"148ef6c2-2929-4c02-9f48-5f292bceba0c\" (UID: \"148ef6c2-2929-4c02-9f48-5f292bceba0c\") " Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.248570 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/148ef6c2-2929-4c02-9f48-5f292bceba0c-kube-api-access-ndzms" (OuterVolumeSpecName: "kube-api-access-ndzms") pod "148ef6c2-2929-4c02-9f48-5f292bceba0c" (UID: "148ef6c2-2929-4c02-9f48-5f292bceba0c"). InnerVolumeSpecName "kube-api-access-ndzms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.317650 4669 scope.go:117] "RemoveContainer" containerID="23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81" Oct 01 12:36:44 crc kubenswrapper[4669]: E1001 12:36:44.324262 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81\": container with ID starting with 23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81 not found: ID does not exist" containerID="23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.324335 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81"} err="failed to get container status \"23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81\": rpc error: code = NotFound desc = could not find container \"23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81\": container with ID starting with 23f11891c7e693583355778f27a4d69d22cc805d9ae16d9927014d95ada90d81 not found: ID does not exist" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.324373 4669 scope.go:117] "RemoveContainer" containerID="76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de" Oct 01 12:36:44 crc kubenswrapper[4669]: E1001 12:36:44.325628 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de\": 
container with ID starting with 76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de not found: ID does not exist" containerID="76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.325670 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de"} err="failed to get container status \"76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de\": rpc error: code = NotFound desc = could not find container \"76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de\": container with ID starting with 76104cc1b5cb5911e1e22e6528c26a33e39f275987cf7802d7cb29f0fbe922de not found: ID does not exist" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.337813 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndzms\" (UniqueName: \"kubernetes.io/projected/148ef6c2-2929-4c02-9f48-5f292bceba0c-kube-api-access-ndzms\") on node \"crc\" DevicePath \"\"" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.440613 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/148ef6c2-2929-4c02-9f48-5f292bceba0c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "148ef6c2-2929-4c02-9f48-5f292bceba0c" (UID: "148ef6c2-2929-4c02-9f48-5f292bceba0c"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:36:44 crc kubenswrapper[4669]: I1001 12:36:44.543102 4669 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/148ef6c2-2929-4c02-9f48-5f292bceba0c-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 01 12:36:45 crc kubenswrapper[4669]: I1001 12:36:45.655705 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="148ef6c2-2929-4c02-9f48-5f292bceba0c" path="/var/lib/kubelet/pods/148ef6c2-2929-4c02-9f48-5f292bceba0c/volumes" Oct 01 12:36:49 crc kubenswrapper[4669]: I1001 12:36:49.655439 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:36:49 crc kubenswrapper[4669]: E1001 12:36:49.656650 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:37:01 crc kubenswrapper[4669]: I1001 12:37:01.644781 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:37:01 crc kubenswrapper[4669]: E1001 12:37:01.645896 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.767991 4669 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hxvt9/must-gather-ps8sr"] Oct 01 12:37:07 crc kubenswrapper[4669]: E1001 12:37:07.769269 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148ef6c2-2929-4c02-9f48-5f292bceba0c" containerName="copy" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769286 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="148ef6c2-2929-4c02-9f48-5f292bceba0c" containerName="copy" Oct 01 12:37:07 crc kubenswrapper[4669]: E1001 12:37:07.769308 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" containerName="registry-server" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769315 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" containerName="registry-server" Oct 01 12:37:07 crc kubenswrapper[4669]: E1001 12:37:07.769339 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" containerName="extract-content" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769346 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" containerName="extract-content" Oct 01 12:37:07 crc kubenswrapper[4669]: E1001 12:37:07.769363 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" containerName="extract-utilities" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769371 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" containerName="extract-utilities" Oct 01 12:37:07 crc kubenswrapper[4669]: E1001 12:37:07.769384 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" containerName="extract-utilities" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769391 4669 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" containerName="extract-utilities" Oct 01 12:37:07 crc kubenswrapper[4669]: E1001 12:37:07.769408 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" containerName="extract-content" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769415 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" containerName="extract-content" Oct 01 12:37:07 crc kubenswrapper[4669]: E1001 12:37:07.769428 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" containerName="registry-server" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769436 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" containerName="registry-server" Oct 01 12:37:07 crc kubenswrapper[4669]: E1001 12:37:07.769449 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148ef6c2-2929-4c02-9f48-5f292bceba0c" containerName="gather" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769456 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="148ef6c2-2929-4c02-9f48-5f292bceba0c" containerName="gather" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769675 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="148ef6c2-2929-4c02-9f48-5f292bceba0c" containerName="copy" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769720 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b105b8-b6f0-4288-ba52-7de7cfb204eb" containerName="registry-server" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769733 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="148ef6c2-2929-4c02-9f48-5f292bceba0c" containerName="gather" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.769741 4669 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ce12572-bdb3-435f-8b75-8f08f68a00fc" containerName="registry-server" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.770937 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hxvt9/must-gather-ps8sr" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.779378 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hxvt9"/"openshift-service-ca.crt" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.779501 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hxvt9"/"default-dockercfg-cpxpl" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.779923 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hxvt9"/"kube-root-ca.crt" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.796734 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hxvt9/must-gather-ps8sr"] Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.902736 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4dx6\" (UniqueName: \"kubernetes.io/projected/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-kube-api-access-n4dx6\") pod \"must-gather-ps8sr\" (UID: \"0a238c08-a2bf-432a-967c-79e1b4dcbfa6\") " pod="openshift-must-gather-hxvt9/must-gather-ps8sr" Oct 01 12:37:07 crc kubenswrapper[4669]: I1001 12:37:07.902816 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-must-gather-output\") pod \"must-gather-ps8sr\" (UID: \"0a238c08-a2bf-432a-967c-79e1b4dcbfa6\") " pod="openshift-must-gather-hxvt9/must-gather-ps8sr" Oct 01 12:37:08 crc kubenswrapper[4669]: I1001 12:37:08.004813 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4dx6\" 
(UniqueName: \"kubernetes.io/projected/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-kube-api-access-n4dx6\") pod \"must-gather-ps8sr\" (UID: \"0a238c08-a2bf-432a-967c-79e1b4dcbfa6\") " pod="openshift-must-gather-hxvt9/must-gather-ps8sr" Oct 01 12:37:08 crc kubenswrapper[4669]: I1001 12:37:08.004888 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-must-gather-output\") pod \"must-gather-ps8sr\" (UID: \"0a238c08-a2bf-432a-967c-79e1b4dcbfa6\") " pod="openshift-must-gather-hxvt9/must-gather-ps8sr" Oct 01 12:37:08 crc kubenswrapper[4669]: I1001 12:37:08.005403 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-must-gather-output\") pod \"must-gather-ps8sr\" (UID: \"0a238c08-a2bf-432a-967c-79e1b4dcbfa6\") " pod="openshift-must-gather-hxvt9/must-gather-ps8sr" Oct 01 12:37:08 crc kubenswrapper[4669]: I1001 12:37:08.028311 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4dx6\" (UniqueName: \"kubernetes.io/projected/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-kube-api-access-n4dx6\") pod \"must-gather-ps8sr\" (UID: \"0a238c08-a2bf-432a-967c-79e1b4dcbfa6\") " pod="openshift-must-gather-hxvt9/must-gather-ps8sr" Oct 01 12:37:08 crc kubenswrapper[4669]: I1001 12:37:08.095441 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hxvt9/must-gather-ps8sr" Oct 01 12:37:08 crc kubenswrapper[4669]: I1001 12:37:08.612038 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hxvt9/must-gather-ps8sr"] Oct 01 12:37:09 crc kubenswrapper[4669]: I1001 12:37:09.533533 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/must-gather-ps8sr" event={"ID":"0a238c08-a2bf-432a-967c-79e1b4dcbfa6","Type":"ContainerStarted","Data":"6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80"} Oct 01 12:37:09 crc kubenswrapper[4669]: I1001 12:37:09.536097 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/must-gather-ps8sr" event={"ID":"0a238c08-a2bf-432a-967c-79e1b4dcbfa6","Type":"ContainerStarted","Data":"7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d"} Oct 01 12:37:09 crc kubenswrapper[4669]: I1001 12:37:09.536211 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/must-gather-ps8sr" event={"ID":"0a238c08-a2bf-432a-967c-79e1b4dcbfa6","Type":"ContainerStarted","Data":"9508da96774455930c9f01490c69d283c733f06498c28099cae02c12dd7c34fb"} Oct 01 12:37:11 crc kubenswrapper[4669]: E1001 12:37:11.978120 4669 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.82:48318->38.102.83.82:43797: write tcp 38.102.83.82:48318->38.102.83.82:43797: write: broken pipe Oct 01 12:37:12 crc kubenswrapper[4669]: E1001 12:37:12.277178 4669 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.82:48416->38.102.83.82:43797: write tcp 38.102.83.82:48416->38.102.83.82:43797: write: broken pipe Oct 01 12:37:12 crc kubenswrapper[4669]: I1001 12:37:12.646047 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:37:12 crc kubenswrapper[4669]: E1001 12:37:12.646397 4669 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:37:12 crc kubenswrapper[4669]: I1001 12:37:12.988570 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hxvt9/must-gather-ps8sr" podStartSLOduration=5.988547907 podStartE2EDuration="5.988547907s" podCreationTimestamp="2025-10-01 12:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:37:09.558845057 +0000 UTC m=+4120.658410054" watchObservedRunningTime="2025-10-01 12:37:12.988547907 +0000 UTC m=+4124.088112884" Oct 01 12:37:12 crc kubenswrapper[4669]: I1001 12:37:12.994914 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hxvt9/crc-debug-9qrt9"] Oct 01 12:37:12 crc kubenswrapper[4669]: I1001 12:37:12.996376 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" Oct 01 12:37:13 crc kubenswrapper[4669]: I1001 12:37:13.127538 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgtmm\" (UniqueName: \"kubernetes.io/projected/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-kube-api-access-mgtmm\") pod \"crc-debug-9qrt9\" (UID: \"83cfee95-2c7d-4802-bfe5-5acf65c89d9d\") " pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" Oct 01 12:37:13 crc kubenswrapper[4669]: I1001 12:37:13.127953 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-host\") pod \"crc-debug-9qrt9\" (UID: \"83cfee95-2c7d-4802-bfe5-5acf65c89d9d\") " pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" Oct 01 12:37:13 crc kubenswrapper[4669]: I1001 12:37:13.230245 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-host\") pod \"crc-debug-9qrt9\" (UID: \"83cfee95-2c7d-4802-bfe5-5acf65c89d9d\") " pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" Oct 01 12:37:13 crc kubenswrapper[4669]: I1001 12:37:13.230379 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgtmm\" (UniqueName: \"kubernetes.io/projected/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-kube-api-access-mgtmm\") pod \"crc-debug-9qrt9\" (UID: \"83cfee95-2c7d-4802-bfe5-5acf65c89d9d\") " pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" Oct 01 12:37:13 crc kubenswrapper[4669]: I1001 12:37:13.230465 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-host\") pod \"crc-debug-9qrt9\" (UID: \"83cfee95-2c7d-4802-bfe5-5acf65c89d9d\") " pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" Oct 01 12:37:13 crc 
kubenswrapper[4669]: I1001 12:37:13.251906 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgtmm\" (UniqueName: \"kubernetes.io/projected/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-kube-api-access-mgtmm\") pod \"crc-debug-9qrt9\" (UID: \"83cfee95-2c7d-4802-bfe5-5acf65c89d9d\") " pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" Oct 01 12:37:13 crc kubenswrapper[4669]: I1001 12:37:13.357470 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" Oct 01 12:37:14 crc kubenswrapper[4669]: I1001 12:37:14.589451 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" event={"ID":"83cfee95-2c7d-4802-bfe5-5acf65c89d9d","Type":"ContainerStarted","Data":"b719e7f6d9256d982311a31312976a85a7e2e35eacc733dbf89ada36a1770f6e"} Oct 01 12:37:14 crc kubenswrapper[4669]: I1001 12:37:14.590345 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" event={"ID":"83cfee95-2c7d-4802-bfe5-5acf65c89d9d","Type":"ContainerStarted","Data":"5e6751b3ca4988d5e669c3f3a525b129f43e7a993082778212d2cb1a42e0f994"} Oct 01 12:37:14 crc kubenswrapper[4669]: I1001 12:37:14.617466 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" podStartSLOduration=2.617434739 podStartE2EDuration="2.617434739s" podCreationTimestamp="2025-10-01 12:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:37:14.607381524 +0000 UTC m=+4125.706946491" watchObservedRunningTime="2025-10-01 12:37:14.617434739 +0000 UTC m=+4125.716999726" Oct 01 12:37:26 crc kubenswrapper[4669]: I1001 12:37:26.644848 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:37:26 crc kubenswrapper[4669]: E1001 
12:37:26.646024 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" Oct 01 12:37:40 crc kubenswrapper[4669]: I1001 12:37:40.644751 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:37:41 crc kubenswrapper[4669]: I1001 12:37:41.914530 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"229df817fe94baa1aade7478bd01c70efd0a8f5ad4457de01e0b88aee5ac9fa9"} Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.047369 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-597fb"] Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.056874 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-597fb" Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.086167 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-597fb"] Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.201159 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-catalog-content\") pod \"community-operators-597fb\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " pod="openshift-marketplace/community-operators-597fb" Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.201674 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sm46\" (UniqueName: \"kubernetes.io/projected/8b13403b-5046-4673-be13-76e3bfef960e-kube-api-access-5sm46\") pod \"community-operators-597fb\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " pod="openshift-marketplace/community-operators-597fb" Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.201813 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-utilities\") pod \"community-operators-597fb\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " pod="openshift-marketplace/community-operators-597fb" Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.305777 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sm46\" (UniqueName: \"kubernetes.io/projected/8b13403b-5046-4673-be13-76e3bfef960e-kube-api-access-5sm46\") pod \"community-operators-597fb\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " pod="openshift-marketplace/community-operators-597fb" Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.305871 4669 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-utilities\") pod \"community-operators-597fb\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " pod="openshift-marketplace/community-operators-597fb" Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.305929 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-catalog-content\") pod \"community-operators-597fb\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " pod="openshift-marketplace/community-operators-597fb" Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.306634 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-catalog-content\") pod \"community-operators-597fb\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " pod="openshift-marketplace/community-operators-597fb" Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.306938 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-utilities\") pod \"community-operators-597fb\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " pod="openshift-marketplace/community-operators-597fb" Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.344148 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sm46\" (UniqueName: \"kubernetes.io/projected/8b13403b-5046-4673-be13-76e3bfef960e-kube-api-access-5sm46\") pod \"community-operators-597fb\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " pod="openshift-marketplace/community-operators-597fb" Oct 01 12:37:52 crc kubenswrapper[4669]: I1001 12:37:52.399268 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-597fb" Oct 01 12:37:53 crc kubenswrapper[4669]: I1001 12:37:53.117155 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-597fb"] Oct 01 12:37:54 crc kubenswrapper[4669]: I1001 12:37:54.053631 4669 generic.go:334] "Generic (PLEG): container finished" podID="8b13403b-5046-4673-be13-76e3bfef960e" containerID="fa76ed85b865e5b195d5e0bedb673fe02229a02280ffe8411ccf22dd46396f29" exitCode=0 Oct 01 12:37:54 crc kubenswrapper[4669]: I1001 12:37:54.053747 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-597fb" event={"ID":"8b13403b-5046-4673-be13-76e3bfef960e","Type":"ContainerDied","Data":"fa76ed85b865e5b195d5e0bedb673fe02229a02280ffe8411ccf22dd46396f29"} Oct 01 12:37:54 crc kubenswrapper[4669]: I1001 12:37:54.054465 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-597fb" event={"ID":"8b13403b-5046-4673-be13-76e3bfef960e","Type":"ContainerStarted","Data":"a21ea1d560e146aba6b53629c2a485c9ee63062d4e5b807cd2b7f558bba7c4c2"} Oct 01 12:37:56 crc kubenswrapper[4669]: I1001 12:37:56.078574 4669 generic.go:334] "Generic (PLEG): container finished" podID="8b13403b-5046-4673-be13-76e3bfef960e" containerID="b209e7193df754c248524b0ea0a362565649087e86334d04b9f9b2a875321d42" exitCode=0 Oct 01 12:37:56 crc kubenswrapper[4669]: I1001 12:37:56.080859 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-597fb" event={"ID":"8b13403b-5046-4673-be13-76e3bfef960e","Type":"ContainerDied","Data":"b209e7193df754c248524b0ea0a362565649087e86334d04b9f9b2a875321d42"} Oct 01 12:37:58 crc kubenswrapper[4669]: I1001 12:37:58.115142 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-597fb" 
event={"ID":"8b13403b-5046-4673-be13-76e3bfef960e","Type":"ContainerStarted","Data":"abf4df06cce6e3eae006b0c189a004b3769e9035bb84d0e08e493f0dab64fe20"} Oct 01 12:37:58 crc kubenswrapper[4669]: I1001 12:37:58.142445 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-597fb" podStartSLOduration=3.116185908 podStartE2EDuration="6.142422386s" podCreationTimestamp="2025-10-01 12:37:52 +0000 UTC" firstStartedPulling="2025-10-01 12:37:54.056290085 +0000 UTC m=+4165.155855062" lastFinishedPulling="2025-10-01 12:37:57.082526563 +0000 UTC m=+4168.182091540" observedRunningTime="2025-10-01 12:37:58.138260503 +0000 UTC m=+4169.237825480" watchObservedRunningTime="2025-10-01 12:37:58.142422386 +0000 UTC m=+4169.241987363" Oct 01 12:38:02 crc kubenswrapper[4669]: I1001 12:38:02.399656 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-597fb" Oct 01 12:38:02 crc kubenswrapper[4669]: I1001 12:38:02.402566 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-597fb" Oct 01 12:38:02 crc kubenswrapper[4669]: I1001 12:38:02.463953 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-597fb" Oct 01 12:38:03 crc kubenswrapper[4669]: I1001 12:38:03.249096 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-597fb" Oct 01 12:38:03 crc kubenswrapper[4669]: I1001 12:38:03.307843 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-597fb"] Oct 01 12:38:05 crc kubenswrapper[4669]: I1001 12:38:05.203215 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-597fb" podUID="8b13403b-5046-4673-be13-76e3bfef960e" containerName="registry-server" 
containerID="cri-o://abf4df06cce6e3eae006b0c189a004b3769e9035bb84d0e08e493f0dab64fe20" gracePeriod=2 Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.218062 4669 generic.go:334] "Generic (PLEG): container finished" podID="8b13403b-5046-4673-be13-76e3bfef960e" containerID="abf4df06cce6e3eae006b0c189a004b3769e9035bb84d0e08e493f0dab64fe20" exitCode=0 Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.218130 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-597fb" event={"ID":"8b13403b-5046-4673-be13-76e3bfef960e","Type":"ContainerDied","Data":"abf4df06cce6e3eae006b0c189a004b3769e9035bb84d0e08e493f0dab64fe20"} Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.218919 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-597fb" event={"ID":"8b13403b-5046-4673-be13-76e3bfef960e","Type":"ContainerDied","Data":"a21ea1d560e146aba6b53629c2a485c9ee63062d4e5b807cd2b7f558bba7c4c2"} Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.218948 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a21ea1d560e146aba6b53629c2a485c9ee63062d4e5b807cd2b7f558bba7c4c2" Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.236003 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-597fb" Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.384300 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-catalog-content\") pod \"8b13403b-5046-4673-be13-76e3bfef960e\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.384370 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sm46\" (UniqueName: \"kubernetes.io/projected/8b13403b-5046-4673-be13-76e3bfef960e-kube-api-access-5sm46\") pod \"8b13403b-5046-4673-be13-76e3bfef960e\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.384431 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-utilities\") pod \"8b13403b-5046-4673-be13-76e3bfef960e\" (UID: \"8b13403b-5046-4673-be13-76e3bfef960e\") " Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.385566 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-utilities" (OuterVolumeSpecName: "utilities") pod "8b13403b-5046-4673-be13-76e3bfef960e" (UID: "8b13403b-5046-4673-be13-76e3bfef960e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.409448 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b13403b-5046-4673-be13-76e3bfef960e-kube-api-access-5sm46" (OuterVolumeSpecName: "kube-api-access-5sm46") pod "8b13403b-5046-4673-be13-76e3bfef960e" (UID: "8b13403b-5046-4673-be13-76e3bfef960e"). InnerVolumeSpecName "kube-api-access-5sm46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.444573 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b13403b-5046-4673-be13-76e3bfef960e" (UID: "8b13403b-5046-4673-be13-76e3bfef960e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.487439 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.487510 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sm46\" (UniqueName: \"kubernetes.io/projected/8b13403b-5046-4673-be13-76e3bfef960e-kube-api-access-5sm46\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:06 crc kubenswrapper[4669]: I1001 12:38:06.487525 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b13403b-5046-4673-be13-76e3bfef960e-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:38:07 crc kubenswrapper[4669]: I1001 12:38:07.228628 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-597fb" Oct 01 12:38:07 crc kubenswrapper[4669]: I1001 12:38:07.284656 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-597fb"] Oct 01 12:38:07 crc kubenswrapper[4669]: I1001 12:38:07.298229 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-597fb"] Oct 01 12:38:07 crc kubenswrapper[4669]: E1001 12:38:07.461989 4669 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b13403b_5046_4673_be13_76e3bfef960e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b13403b_5046_4673_be13_76e3bfef960e.slice/crio-a21ea1d560e146aba6b53629c2a485c9ee63062d4e5b807cd2b7f558bba7c4c2\": RecentStats: unable to find data in memory cache]" Oct 01 12:38:07 crc kubenswrapper[4669]: I1001 12:38:07.656732 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b13403b-5046-4673-be13-76e3bfef960e" path="/var/lib/kubelet/pods/8b13403b-5046-4673-be13-76e3bfef960e/volumes" Oct 01 12:38:32 crc kubenswrapper[4669]: I1001 12:38:32.945116 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d474ff6b-jd7xf_90e4ab06-115b-4efa-9a11-d16218dec9e0/barbican-api-log/0.log" Oct 01 12:38:32 crc kubenswrapper[4669]: I1001 12:38:32.947462 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84d474ff6b-jd7xf_90e4ab06-115b-4efa-9a11-d16218dec9e0/barbican-api/0.log" Oct 01 12:38:33 crc kubenswrapper[4669]: I1001 12:38:33.167454 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b7c87b994-mshrj_14df8713-8fa5-482c-9280-af169783618d/barbican-keystone-listener/0.log" Oct 01 12:38:33 crc kubenswrapper[4669]: I1001 
12:38:33.229897 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b7c87b994-mshrj_14df8713-8fa5-482c-9280-af169783618d/barbican-keystone-listener-log/0.log" Oct 01 12:38:33 crc kubenswrapper[4669]: I1001 12:38:33.415831 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-84b6d46dff-gdp9m_c2f34b06-3e5b-4380-8b38-4c9be553dc00/barbican-worker/0.log" Oct 01 12:38:33 crc kubenswrapper[4669]: I1001 12:38:33.486013 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-84b6d46dff-gdp9m_c2f34b06-3e5b-4380-8b38-4c9be553dc00/barbican-worker-log/0.log" Oct 01 12:38:33 crc kubenswrapper[4669]: I1001 12:38:33.700753 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8w8x6_b905607b-b7ef-420f-8c4e-603d4c788186/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:33 crc kubenswrapper[4669]: I1001 12:38:33.932361 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272/ceilometer-notification-agent/0.log" Oct 01 12:38:33 crc kubenswrapper[4669]: I1001 12:38:33.980736 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272/proxy-httpd/0.log" Oct 01 12:38:33 crc kubenswrapper[4669]: I1001 12:38:33.981105 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272/ceilometer-central-agent/0.log" Oct 01 12:38:34 crc kubenswrapper[4669]: I1001 12:38:34.109381 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8d57d6c0-5ebc-48cd-8be3-cc9bc2f65272/sg-core/0.log" Oct 01 12:38:34 crc kubenswrapper[4669]: I1001 12:38:34.235215 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_0ad8d85d-0bac-4894-91c9-ad9cd6d485ad/cinder-api/0.log" Oct 01 12:38:34 crc kubenswrapper[4669]: I1001 12:38:34.947655 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0ad8d85d-0bac-4894-91c9-ad9cd6d485ad/cinder-api-log/0.log" Oct 01 12:38:35 crc kubenswrapper[4669]: I1001 12:38:35.005373 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7dc50b83-702d-4bf7-bee7-87ead33a1faa/cinder-scheduler/0.log" Oct 01 12:38:35 crc kubenswrapper[4669]: I1001 12:38:35.186165 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7dc50b83-702d-4bf7-bee7-87ead33a1faa/probe/0.log" Oct 01 12:38:35 crc kubenswrapper[4669]: I1001 12:38:35.234236 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-knpjc_d753b30d-e1c5-45b9-8d78-767dd0cadaea/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:35 crc kubenswrapper[4669]: I1001 12:38:35.452528 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-m55t4_bee90766-2c6f-4f88-a17d-33098d6599a9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:35 crc kubenswrapper[4669]: I1001 12:38:35.642915 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-tpr99_667c6c9f-b26e-4edb-b3f7-5d7241afb839/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:35 crc kubenswrapper[4669]: I1001 12:38:35.798288 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-bkd88_9d9999e8-41a9-4930-b113-7f135640c123/init/0.log" Oct 01 12:38:36 crc kubenswrapper[4669]: I1001 12:38:36.024645 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-bkd88_9d9999e8-41a9-4930-b113-7f135640c123/init/0.log" Oct 01 12:38:36 crc kubenswrapper[4669]: I1001 12:38:36.082931 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-bkd88_9d9999e8-41a9-4930-b113-7f135640c123/dnsmasq-dns/0.log" Oct 01 12:38:36 crc kubenswrapper[4669]: I1001 12:38:36.259218 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rvw82_261f1c48-3c07-495d-b916-861c2a1943d8/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:36 crc kubenswrapper[4669]: I1001 12:38:36.305363 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0d4ea2b9-c6e4-4d27-866a-420be44d88f8/glance-httpd/0.log" Oct 01 12:38:36 crc kubenswrapper[4669]: I1001 12:38:36.325829 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0d4ea2b9-c6e4-4d27-866a-420be44d88f8/glance-log/0.log" Oct 01 12:38:36 crc kubenswrapper[4669]: I1001 12:38:36.538711 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0712d8cd-5673-4792-bafd-463179234f1d/glance-httpd/0.log" Oct 01 12:38:36 crc kubenswrapper[4669]: I1001 12:38:36.552727 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0712d8cd-5673-4792-bafd-463179234f1d/glance-log/0.log" Oct 01 12:38:37 crc kubenswrapper[4669]: I1001 12:38:37.245751 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-jh9h2_bb0c4afd-aaf3-4875-94ec-668841ba1127/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:37 crc kubenswrapper[4669]: I1001 12:38:37.445651 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-74d4dc5744-kqwsh_050a3c50-c6fb-4371-a309-af03e288d70d/horizon/0.log" Oct 01 12:38:37 crc kubenswrapper[4669]: I1001 12:38:37.633113 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-blscr_b71b4047-5538-4132-9247-8b9b34e6979c/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:37 crc kubenswrapper[4669]: I1001 12:38:37.862888 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-74d4dc5744-kqwsh_050a3c50-c6fb-4371-a309-af03e288d70d/horizon-log/0.log" Oct 01 12:38:37 crc kubenswrapper[4669]: I1001 12:38:37.960336 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29322001-ljw4f_6de4821a-ded1-483f-ade1-dda52ecc46ed/keystone-cron/0.log" Oct 01 12:38:37 crc kubenswrapper[4669]: I1001 12:38:37.995988 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5d99769bb4-lq4fx_85b6fded-ed15-47f3-8e06-23511061f9b1/keystone-api/0.log" Oct 01 12:38:38 crc kubenswrapper[4669]: I1001 12:38:38.148986 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d4ec071d-763f-4513-8e0b-30fd6c1980d0/kube-state-metrics/0.log" Oct 01 12:38:38 crc kubenswrapper[4669]: I1001 12:38:38.221463 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9vqjc_9f57f089-5ea5-4b92-acbb-e14488a50253/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:38 crc kubenswrapper[4669]: I1001 12:38:38.754897 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-75fdb4d7c7-7ltfb_74d7e57e-eda0-4134-bfd3-ed2c0e4826bf/neutron-api/0.log" Oct 01 12:38:38 crc kubenswrapper[4669]: I1001 12:38:38.830302 4669 scope.go:117] "RemoveContainer" containerID="ad68806eae3e6f36666b239f6eefd4283bac0a5eecdd1ca5335b2ba8737f926f" Oct 01 12:38:38 crc kubenswrapper[4669]: I1001 
12:38:38.840244 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-75fdb4d7c7-7ltfb_74d7e57e-eda0-4134-bfd3-ed2c0e4826bf/neutron-httpd/0.log" Oct 01 12:38:39 crc kubenswrapper[4669]: I1001 12:38:39.051333 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-lmzmm_09c6e280-6373-44f6-ad9b-fe24fe56e738/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:39 crc kubenswrapper[4669]: I1001 12:38:39.693400 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b39855ee-c66e-4f78-8128-a0149c9431da/nova-api-log/0.log" Oct 01 12:38:39 crc kubenswrapper[4669]: I1001 12:38:39.932169 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_35e646b8-72fe-4762-a24b-a74ddfb6be97/nova-cell0-conductor-conductor/0.log" Oct 01 12:38:40 crc kubenswrapper[4669]: I1001 12:38:40.152591 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b39855ee-c66e-4f78-8128-a0149c9431da/nova-api-api/0.log" Oct 01 12:38:40 crc kubenswrapper[4669]: I1001 12:38:40.335369 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_2266ee85-7b31-496a-9dbd-6d69e282e847/nova-cell1-conductor-conductor/0.log" Oct 01 12:38:40 crc kubenswrapper[4669]: I1001 12:38:40.676737 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ef9631f5-92a1-4d2b-a5a6-25b60a609d61/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 12:38:40 crc kubenswrapper[4669]: I1001 12:38:40.826024 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-zr89n_da3d07f3-8fb0-4ab3-a350-ad5b2a09af97/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:41 crc kubenswrapper[4669]: I1001 12:38:41.040857 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_80be53d5-3338-467a-9be5-779722416d52/nova-metadata-log/0.log" Oct 01 12:38:41 crc kubenswrapper[4669]: I1001 12:38:41.617506 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_25ce3ac8-78ca-445e-acd1-995d99a5757a/nova-scheduler-scheduler/0.log" Oct 01 12:38:41 crc kubenswrapper[4669]: I1001 12:38:41.860570 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_92bd05a8-df03-4e85-b32a-dc3ced713159/mysql-bootstrap/0.log" Oct 01 12:38:42 crc kubenswrapper[4669]: I1001 12:38:42.047327 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_92bd05a8-df03-4e85-b32a-dc3ced713159/mysql-bootstrap/0.log" Oct 01 12:38:42 crc kubenswrapper[4669]: I1001 12:38:42.089396 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_92bd05a8-df03-4e85-b32a-dc3ced713159/galera/0.log" Oct 01 12:38:42 crc kubenswrapper[4669]: I1001 12:38:42.385310 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_872d79b4-0374-4e78-98e4-32393e2f7f05/mysql-bootstrap/0.log" Oct 01 12:38:42 crc kubenswrapper[4669]: I1001 12:38:42.606439 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_872d79b4-0374-4e78-98e4-32393e2f7f05/mysql-bootstrap/0.log" Oct 01 12:38:42 crc kubenswrapper[4669]: I1001 12:38:42.645268 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_872d79b4-0374-4e78-98e4-32393e2f7f05/galera/0.log" Oct 01 12:38:42 crc kubenswrapper[4669]: I1001 12:38:42.867729 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d68adea0-9ec1-4cc3-a727-a64457a70c9b/openstackclient/0.log" Oct 01 12:38:43 crc kubenswrapper[4669]: I1001 12:38:43.054020 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_80be53d5-3338-467a-9be5-779722416d52/nova-metadata-metadata/0.log" Oct 01 12:38:43 crc kubenswrapper[4669]: I1001 12:38:43.133096 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nsrfk_b77a4c9a-0426-40f6-a28a-7b985aebc4a2/openstack-network-exporter/0.log" Oct 01 12:38:43 crc kubenswrapper[4669]: I1001 12:38:43.386644 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d5fz7_1c9e9459-07b3-4f2d-9385-7c41a5bb6edd/ovsdb-server-init/0.log" Oct 01 12:38:43 crc kubenswrapper[4669]: I1001 12:38:43.730490 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d5fz7_1c9e9459-07b3-4f2d-9385-7c41a5bb6edd/ovsdb-server-init/0.log" Oct 01 12:38:43 crc kubenswrapper[4669]: I1001 12:38:43.757521 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d5fz7_1c9e9459-07b3-4f2d-9385-7c41a5bb6edd/ovsdb-server/0.log" Oct 01 12:38:43 crc kubenswrapper[4669]: I1001 12:38:43.761525 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d5fz7_1c9e9459-07b3-4f2d-9385-7c41a5bb6edd/ovs-vswitchd/0.log" Oct 01 12:38:43 crc kubenswrapper[4669]: I1001 12:38:43.995100 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-plhdj_c5ffe639-af06-4c4c-8794-a1becff8a692/ovn-controller/0.log" Oct 01 12:38:44 crc kubenswrapper[4669]: I1001 12:38:44.311832 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xmvtv_ffe0bf53-0bbb-45ac-96b3-fa31c365470c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:44 crc kubenswrapper[4669]: I1001 12:38:44.441959 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_83f3ffe1-ac22-408f-ab82-73d5cfd82953/openstack-network-exporter/0.log" Oct 01 12:38:44 crc kubenswrapper[4669]: I1001 
12:38:44.624213 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_83f3ffe1-ac22-408f-ab82-73d5cfd82953/ovn-northd/0.log" Oct 01 12:38:44 crc kubenswrapper[4669]: I1001 12:38:44.664235 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_76c8bfa8-2fca-4a74-85e8-f44af35d612f/openstack-network-exporter/0.log" Oct 01 12:38:44 crc kubenswrapper[4669]: I1001 12:38:44.871023 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_76c8bfa8-2fca-4a74-85e8-f44af35d612f/ovsdbserver-nb/0.log" Oct 01 12:38:44 crc kubenswrapper[4669]: I1001 12:38:44.915251 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1d13ad6e-a577-4f92-95ea-8ad268373774/openstack-network-exporter/0.log" Oct 01 12:38:45 crc kubenswrapper[4669]: I1001 12:38:45.430234 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1d13ad6e-a577-4f92-95ea-8ad268373774/ovsdbserver-sb/0.log" Oct 01 12:38:45 crc kubenswrapper[4669]: I1001 12:38:45.647239 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-795f7c5588-ppc46_419df7bd-f554-4888-8a51-e885964ada7e/placement-api/0.log" Oct 01 12:38:45 crc kubenswrapper[4669]: I1001 12:38:45.859687 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-795f7c5588-ppc46_419df7bd-f554-4888-8a51-e885964ada7e/placement-log/0.log" Oct 01 12:38:45 crc kubenswrapper[4669]: I1001 12:38:45.947724 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e/setup-container/0.log" Oct 01 12:38:46 crc kubenswrapper[4669]: I1001 12:38:46.172600 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e/setup-container/0.log" Oct 01 12:38:46 crc kubenswrapper[4669]: I1001 12:38:46.248443 4669 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f1de3ce8-45aa-4a56-9f4a-1fed0a713d2e/rabbitmq/0.log" Oct 01 12:38:46 crc kubenswrapper[4669]: I1001 12:38:46.877041 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_352c2b88-bf96-4858-b166-d5655b36b2b0/setup-container/0.log" Oct 01 12:38:47 crc kubenswrapper[4669]: I1001 12:38:47.084164 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_352c2b88-bf96-4858-b166-d5655b36b2b0/setup-container/0.log" Oct 01 12:38:47 crc kubenswrapper[4669]: I1001 12:38:47.112203 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_352c2b88-bf96-4858-b166-d5655b36b2b0/rabbitmq/0.log" Oct 01 12:38:47 crc kubenswrapper[4669]: I1001 12:38:47.374095 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6xqvf_266686ce-e77a-4c6f-83d3-4d417e9a819f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:47 crc kubenswrapper[4669]: I1001 12:38:47.415796 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-k47v2_a422f4f8-7b2e-4f73-89e8-2659cda6effa/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:47 crc kubenswrapper[4669]: I1001 12:38:47.670405 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-v79z8_3f131ccb-5e9b-4097-8abe-f10d6f2c9b52/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:47 crc kubenswrapper[4669]: I1001 12:38:47.927940 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-c4phj_0ffd3326-9422-4f07-b3e1-857324cff3e2/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:48 crc kubenswrapper[4669]: I1001 12:38:48.044520 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-l66p2_7c88952b-368f-4527-8916-b4877e5af1e3/ssh-known-hosts-edpm-deployment/0.log" Oct 01 12:38:48 crc kubenswrapper[4669]: I1001 12:38:48.360165 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6c769b8b9-5svbp_fd677364-3064-4b42-9555-b640561fa4ed/proxy-server/0.log" Oct 01 12:38:48 crc kubenswrapper[4669]: I1001 12:38:48.477321 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6c769b8b9-5svbp_fd677364-3064-4b42-9555-b640561fa4ed/proxy-httpd/0.log" Oct 01 12:38:49 crc kubenswrapper[4669]: I1001 12:38:49.098383 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-pw6p8_9c77921b-54a6-48fd-a57c-4c14d17bf7d3/swift-ring-rebalance/0.log" Oct 01 12:38:49 crc kubenswrapper[4669]: I1001 12:38:49.322850 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/account-auditor/0.log" Oct 01 12:38:49 crc kubenswrapper[4669]: I1001 12:38:49.439244 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/account-reaper/0.log" Oct 01 12:38:49 crc kubenswrapper[4669]: I1001 12:38:49.498004 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/account-replicator/0.log" Oct 01 12:38:49 crc kubenswrapper[4669]: I1001 12:38:49.584030 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/account-server/0.log" Oct 01 12:38:49 crc kubenswrapper[4669]: I1001 12:38:49.686849 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/container-auditor/0.log" Oct 01 12:38:49 crc kubenswrapper[4669]: I1001 12:38:49.758777 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/container-replicator/0.log" Oct 01 12:38:49 crc kubenswrapper[4669]: I1001 12:38:49.837879 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/container-server/0.log" Oct 01 12:38:49 crc kubenswrapper[4669]: I1001 12:38:49.935543 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/container-updater/0.log" Oct 01 12:38:50 crc kubenswrapper[4669]: I1001 12:38:50.061993 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/object-auditor/0.log" Oct 01 12:38:50 crc kubenswrapper[4669]: I1001 12:38:50.115598 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/object-expirer/0.log" Oct 01 12:38:50 crc kubenswrapper[4669]: I1001 12:38:50.213844 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/object-replicator/0.log" Oct 01 12:38:50 crc kubenswrapper[4669]: I1001 12:38:50.301894 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/object-server/0.log" Oct 01 12:38:50 crc kubenswrapper[4669]: I1001 12:38:50.380697 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/object-updater/0.log" Oct 01 12:38:50 crc kubenswrapper[4669]: I1001 12:38:50.485348 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/rsync/0.log" Oct 01 12:38:50 crc kubenswrapper[4669]: I1001 12:38:50.595967 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_681d4309-a9a8-4c2c-bf25-4619653187fd/swift-recon-cron/0.log" Oct 01 12:38:50 crc kubenswrapper[4669]: I1001 12:38:50.855666 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nw9hl_d1966594-3c43-4ecf-a982-fc851d0bb43b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:38:50 crc kubenswrapper[4669]: I1001 12:38:50.896863 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_fce73f67-b429-4b4a-b873-a45f92d104c7/tempest-tests-tempest-tests-runner/0.log" Oct 01 12:38:51 crc kubenswrapper[4669]: I1001 12:38:51.078581 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b2e123bd-d4e4-4b23-a8e0-07ea01e2c586/test-operator-logs-container/0.log" Oct 01 12:38:51 crc kubenswrapper[4669]: I1001 12:38:51.326287 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-krxsp_74c54aa8-261e-4bad-babf-2838c6b49114/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 12:39:00 crc kubenswrapper[4669]: I1001 12:39:00.432675 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0dda17c6-d274-4975-8796-deda5fd09e9c/memcached/0.log" Oct 01 12:39:27 crc kubenswrapper[4669]: I1001 12:39:27.154877 4669 generic.go:334] "Generic (PLEG): container finished" podID="83cfee95-2c7d-4802-bfe5-5acf65c89d9d" containerID="b719e7f6d9256d982311a31312976a85a7e2e35eacc733dbf89ada36a1770f6e" exitCode=0 Oct 01 12:39:27 crc kubenswrapper[4669]: I1001 12:39:27.154895 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" event={"ID":"83cfee95-2c7d-4802-bfe5-5acf65c89d9d","Type":"ContainerDied","Data":"b719e7f6d9256d982311a31312976a85a7e2e35eacc733dbf89ada36a1770f6e"} Oct 01 12:39:28 crc kubenswrapper[4669]: 
I1001 12:39:28.315733 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" Oct 01 12:39:28 crc kubenswrapper[4669]: I1001 12:39:28.363376 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hxvt9/crc-debug-9qrt9"] Oct 01 12:39:28 crc kubenswrapper[4669]: I1001 12:39:28.375984 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hxvt9/crc-debug-9qrt9"] Oct 01 12:39:28 crc kubenswrapper[4669]: I1001 12:39:28.418315 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-host\") pod \"83cfee95-2c7d-4802-bfe5-5acf65c89d9d\" (UID: \"83cfee95-2c7d-4802-bfe5-5acf65c89d9d\") " Oct 01 12:39:28 crc kubenswrapper[4669]: I1001 12:39:28.418541 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-host" (OuterVolumeSpecName: "host") pod "83cfee95-2c7d-4802-bfe5-5acf65c89d9d" (UID: "83cfee95-2c7d-4802-bfe5-5acf65c89d9d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:39:28 crc kubenswrapper[4669]: I1001 12:39:28.419251 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgtmm\" (UniqueName: \"kubernetes.io/projected/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-kube-api-access-mgtmm\") pod \"83cfee95-2c7d-4802-bfe5-5acf65c89d9d\" (UID: \"83cfee95-2c7d-4802-bfe5-5acf65c89d9d\") " Oct 01 12:39:28 crc kubenswrapper[4669]: I1001 12:39:28.420477 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-host\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:28 crc kubenswrapper[4669]: I1001 12:39:28.430315 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-kube-api-access-mgtmm" (OuterVolumeSpecName: "kube-api-access-mgtmm") pod "83cfee95-2c7d-4802-bfe5-5acf65c89d9d" (UID: "83cfee95-2c7d-4802-bfe5-5acf65c89d9d"). InnerVolumeSpecName "kube-api-access-mgtmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:39:28 crc kubenswrapper[4669]: I1001 12:39:28.523412 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgtmm\" (UniqueName: \"kubernetes.io/projected/83cfee95-2c7d-4802-bfe5-5acf65c89d9d-kube-api-access-mgtmm\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.182463 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e6751b3ca4988d5e669c3f3a525b129f43e7a993082778212d2cb1a42e0f994" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.182543 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-9qrt9" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.574915 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hxvt9/crc-debug-7rzq5"] Oct 01 12:39:29 crc kubenswrapper[4669]: E1001 12:39:29.575959 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83cfee95-2c7d-4802-bfe5-5acf65c89d9d" containerName="container-00" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.575982 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="83cfee95-2c7d-4802-bfe5-5acf65c89d9d" containerName="container-00" Oct 01 12:39:29 crc kubenswrapper[4669]: E1001 12:39:29.576057 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b13403b-5046-4673-be13-76e3bfef960e" containerName="extract-utilities" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.576067 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b13403b-5046-4673-be13-76e3bfef960e" containerName="extract-utilities" Oct 01 12:39:29 crc kubenswrapper[4669]: E1001 12:39:29.576116 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b13403b-5046-4673-be13-76e3bfef960e" containerName="registry-server" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.576127 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b13403b-5046-4673-be13-76e3bfef960e" containerName="registry-server" Oct 01 12:39:29 crc kubenswrapper[4669]: E1001 12:39:29.576164 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b13403b-5046-4673-be13-76e3bfef960e" containerName="extract-content" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.576172 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b13403b-5046-4673-be13-76e3bfef960e" containerName="extract-content" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.576814 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b13403b-5046-4673-be13-76e3bfef960e" 
containerName="registry-server" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.577206 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="83cfee95-2c7d-4802-bfe5-5acf65c89d9d" containerName="container-00" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.579671 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.649524 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-host\") pod \"crc-debug-7rzq5\" (UID: \"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d\") " pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.649741 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gb2t\" (UniqueName: \"kubernetes.io/projected/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-kube-api-access-9gb2t\") pod \"crc-debug-7rzq5\" (UID: \"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d\") " pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.657460 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83cfee95-2c7d-4802-bfe5-5acf65c89d9d" path="/var/lib/kubelet/pods/83cfee95-2c7d-4802-bfe5-5acf65c89d9d/volumes" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.751517 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gb2t\" (UniqueName: \"kubernetes.io/projected/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-kube-api-access-9gb2t\") pod \"crc-debug-7rzq5\" (UID: \"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d\") " pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.752376 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-host\") pod \"crc-debug-7rzq5\" (UID: \"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d\") " pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.752612 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-host\") pod \"crc-debug-7rzq5\" (UID: \"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d\") " pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.778673 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gb2t\" (UniqueName: \"kubernetes.io/projected/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-kube-api-access-9gb2t\") pod \"crc-debug-7rzq5\" (UID: \"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d\") " pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" Oct 01 12:39:29 crc kubenswrapper[4669]: I1001 12:39:29.908309 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" Oct 01 12:39:30 crc kubenswrapper[4669]: I1001 12:39:30.198544 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" event={"ID":"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d","Type":"ContainerStarted","Data":"170a59962c5b18c9223ac17361b65e6c53c7eb74b87602ef621f195bffc78301"} Oct 01 12:39:31 crc kubenswrapper[4669]: I1001 12:39:31.213734 4669 generic.go:334] "Generic (PLEG): container finished" podID="cf42c42e-ab40-47f3-b8a4-dda7bee3c63d" containerID="6e160f928f69eebca6cdd4c556ae0959ab6fd6dc17fb32fa976b6c42dda33280" exitCode=0 Oct 01 12:39:31 crc kubenswrapper[4669]: I1001 12:39:31.213829 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" event={"ID":"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d","Type":"ContainerDied","Data":"6e160f928f69eebca6cdd4c556ae0959ab6fd6dc17fb32fa976b6c42dda33280"} Oct 01 12:39:32 crc kubenswrapper[4669]: I1001 12:39:32.350142 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" Oct 01 12:39:32 crc kubenswrapper[4669]: I1001 12:39:32.523330 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-host\") pod \"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d\" (UID: \"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d\") " Oct 01 12:39:32 crc kubenswrapper[4669]: I1001 12:39:32.523726 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gb2t\" (UniqueName: \"kubernetes.io/projected/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-kube-api-access-9gb2t\") pod \"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d\" (UID: \"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d\") " Oct 01 12:39:32 crc kubenswrapper[4669]: I1001 12:39:32.523893 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-host" (OuterVolumeSpecName: "host") pod "cf42c42e-ab40-47f3-b8a4-dda7bee3c63d" (UID: "cf42c42e-ab40-47f3-b8a4-dda7bee3c63d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:39:32 crc kubenswrapper[4669]: I1001 12:39:32.524293 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-host\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:32 crc kubenswrapper[4669]: I1001 12:39:32.543438 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-kube-api-access-9gb2t" (OuterVolumeSpecName: "kube-api-access-9gb2t") pod "cf42c42e-ab40-47f3-b8a4-dda7bee3c63d" (UID: "cf42c42e-ab40-47f3-b8a4-dda7bee3c63d"). InnerVolumeSpecName "kube-api-access-9gb2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:39:32 crc kubenswrapper[4669]: I1001 12:39:32.626005 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gb2t\" (UniqueName: \"kubernetes.io/projected/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d-kube-api-access-9gb2t\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:33 crc kubenswrapper[4669]: I1001 12:39:33.240508 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" event={"ID":"cf42c42e-ab40-47f3-b8a4-dda7bee3c63d","Type":"ContainerDied","Data":"170a59962c5b18c9223ac17361b65e6c53c7eb74b87602ef621f195bffc78301"} Oct 01 12:39:33 crc kubenswrapper[4669]: I1001 12:39:33.240576 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="170a59962c5b18c9223ac17361b65e6c53c7eb74b87602ef621f195bffc78301" Oct 01 12:39:33 crc kubenswrapper[4669]: I1001 12:39:33.240629 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-7rzq5" Oct 01 12:39:39 crc kubenswrapper[4669]: I1001 12:39:39.610379 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hxvt9/crc-debug-7rzq5"] Oct 01 12:39:39 crc kubenswrapper[4669]: I1001 12:39:39.623801 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hxvt9/crc-debug-7rzq5"] Oct 01 12:39:39 crc kubenswrapper[4669]: I1001 12:39:39.657471 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf42c42e-ab40-47f3-b8a4-dda7bee3c63d" path="/var/lib/kubelet/pods/cf42c42e-ab40-47f3-b8a4-dda7bee3c63d/volumes" Oct 01 12:39:40 crc kubenswrapper[4669]: I1001 12:39:40.884011 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hxvt9/crc-debug-k4n7n"] Oct 01 12:39:40 crc kubenswrapper[4669]: E1001 12:39:40.885351 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf42c42e-ab40-47f3-b8a4-dda7bee3c63d" 
containerName="container-00" Oct 01 12:39:40 crc kubenswrapper[4669]: I1001 12:39:40.885375 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf42c42e-ab40-47f3-b8a4-dda7bee3c63d" containerName="container-00" Oct 01 12:39:40 crc kubenswrapper[4669]: I1001 12:39:40.885725 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf42c42e-ab40-47f3-b8a4-dda7bee3c63d" containerName="container-00" Oct 01 12:39:40 crc kubenswrapper[4669]: I1001 12:39:40.886950 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" Oct 01 12:39:41 crc kubenswrapper[4669]: I1001 12:39:41.017234 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5pxt\" (UniqueName: \"kubernetes.io/projected/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-kube-api-access-p5pxt\") pod \"crc-debug-k4n7n\" (UID: \"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15\") " pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" Oct 01 12:39:41 crc kubenswrapper[4669]: I1001 12:39:41.017485 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-host\") pod \"crc-debug-k4n7n\" (UID: \"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15\") " pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" Oct 01 12:39:41 crc kubenswrapper[4669]: I1001 12:39:41.120531 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5pxt\" (UniqueName: \"kubernetes.io/projected/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-kube-api-access-p5pxt\") pod \"crc-debug-k4n7n\" (UID: \"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15\") " pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" Oct 01 12:39:41 crc kubenswrapper[4669]: I1001 12:39:41.120701 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-host\") pod \"crc-debug-k4n7n\" (UID: \"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15\") " pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" Oct 01 12:39:41 crc kubenswrapper[4669]: I1001 12:39:41.120896 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-host\") pod \"crc-debug-k4n7n\" (UID: \"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15\") " pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" Oct 01 12:39:41 crc kubenswrapper[4669]: I1001 12:39:41.440528 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5pxt\" (UniqueName: \"kubernetes.io/projected/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-kube-api-access-p5pxt\") pod \"crc-debug-k4n7n\" (UID: \"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15\") " pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" Oct 01 12:39:41 crc kubenswrapper[4669]: I1001 12:39:41.507443 4669 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" Oct 01 12:39:42 crc kubenswrapper[4669]: I1001 12:39:42.350859 4669 generic.go:334] "Generic (PLEG): container finished" podID="ea1062b7-a87d-4d90-b5f0-4bf4c6709b15" containerID="2036308997a269ca62cc4853ef6ea3769591c3813df9bf38f46b547a189b2432" exitCode=0 Oct 01 12:39:42 crc kubenswrapper[4669]: I1001 12:39:42.350949 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" event={"ID":"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15","Type":"ContainerDied","Data":"2036308997a269ca62cc4853ef6ea3769591c3813df9bf38f46b547a189b2432"} Oct 01 12:39:42 crc kubenswrapper[4669]: I1001 12:39:42.351319 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" event={"ID":"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15","Type":"ContainerStarted","Data":"27d3cf9f6adeef5008492310729efad3335949ba8778aacab391208eb1610351"} Oct 01 12:39:42 crc kubenswrapper[4669]: I1001 12:39:42.416395 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hxvt9/crc-debug-k4n7n"] Oct 01 12:39:42 crc kubenswrapper[4669]: I1001 12:39:42.430624 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hxvt9/crc-debug-k4n7n"] Oct 01 12:39:43 crc kubenswrapper[4669]: I1001 12:39:43.491665 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" Oct 01 12:39:43 crc kubenswrapper[4669]: I1001 12:39:43.577242 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5pxt\" (UniqueName: \"kubernetes.io/projected/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-kube-api-access-p5pxt\") pod \"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15\" (UID: \"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15\") " Oct 01 12:39:43 crc kubenswrapper[4669]: I1001 12:39:43.577408 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-host\") pod \"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15\" (UID: \"ea1062b7-a87d-4d90-b5f0-4bf4c6709b15\") " Oct 01 12:39:43 crc kubenswrapper[4669]: I1001 12:39:43.577554 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-host" (OuterVolumeSpecName: "host") pod "ea1062b7-a87d-4d90-b5f0-4bf4c6709b15" (UID: "ea1062b7-a87d-4d90-b5f0-4bf4c6709b15"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:39:43 crc kubenswrapper[4669]: I1001 12:39:43.578155 4669 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-host\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:43 crc kubenswrapper[4669]: I1001 12:39:43.585305 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-kube-api-access-p5pxt" (OuterVolumeSpecName: "kube-api-access-p5pxt") pod "ea1062b7-a87d-4d90-b5f0-4bf4c6709b15" (UID: "ea1062b7-a87d-4d90-b5f0-4bf4c6709b15"). InnerVolumeSpecName "kube-api-access-p5pxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:39:43 crc kubenswrapper[4669]: I1001 12:39:43.657243 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea1062b7-a87d-4d90-b5f0-4bf4c6709b15" path="/var/lib/kubelet/pods/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15/volumes" Oct 01 12:39:43 crc kubenswrapper[4669]: I1001 12:39:43.680518 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5pxt\" (UniqueName: \"kubernetes.io/projected/ea1062b7-a87d-4d90-b5f0-4bf4c6709b15-kube-api-access-p5pxt\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.119937 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/util/0.log" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.338744 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/pull/0.log" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.372827 4669 scope.go:117] "RemoveContainer" containerID="2036308997a269ca62cc4853ef6ea3769591c3813df9bf38f46b547a189b2432" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.372906 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hxvt9/crc-debug-k4n7n" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.390012 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/util/0.log" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.390621 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/pull/0.log" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.610292 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/pull/0.log" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.665949 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/util/0.log" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.676932 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7162260331a86559b293dc3f3fffe3804c1a3cba4dea9d09afdd27bd8fhsp8t_4920edbb-5c89-4081-821f-5b7fcaa1bf9c/extract/0.log" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.844976 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-9m4xq_aba4ff11-8110-4490-8a20-74c454be55d8/kube-rbac-proxy/0.log" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.938572 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-cz2dp_7f0d56cb-1002-4345-903e-7e5979f47978/kube-rbac-proxy/0.log" Oct 01 12:39:44 crc kubenswrapper[4669]: I1001 12:39:44.964633 4669 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-9m4xq_aba4ff11-8110-4490-8a20-74c454be55d8/manager/0.log" Oct 01 12:39:45 crc kubenswrapper[4669]: I1001 12:39:45.123596 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-cz2dp_7f0d56cb-1002-4345-903e-7e5979f47978/manager/0.log" Oct 01 12:39:45 crc kubenswrapper[4669]: I1001 12:39:45.178773 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-xwt57_4fa32a0a-904e-4b37-8ffb-a8c1d89df689/kube-rbac-proxy/0.log" Oct 01 12:39:45 crc kubenswrapper[4669]: I1001 12:39:45.239615 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-xwt57_4fa32a0a-904e-4b37-8ffb-a8c1d89df689/manager/0.log" Oct 01 12:39:45 crc kubenswrapper[4669]: I1001 12:39:45.384653 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-dhqk6_e8163ded-d297-43ea-bde7-b5b90bdf1d17/kube-rbac-proxy/0.log" Oct 01 12:39:45 crc kubenswrapper[4669]: I1001 12:39:45.505530 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-dhqk6_e8163ded-d297-43ea-bde7-b5b90bdf1d17/manager/0.log" Oct 01 12:39:45 crc kubenswrapper[4669]: I1001 12:39:45.575228 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-2z84q_ebc9c519-e267-43d1-93b7-4cf38c84cc66/kube-rbac-proxy/0.log" Oct 01 12:39:45 crc kubenswrapper[4669]: I1001 12:39:45.635198 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-2z84q_ebc9c519-e267-43d1-93b7-4cf38c84cc66/manager/0.log" Oct 01 12:39:45 crc kubenswrapper[4669]: I1001 12:39:45.753776 
4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-5fxfq_1265856e-7658-44ca-b0a9-a0a5a42b8f5d/kube-rbac-proxy/0.log" Oct 01 12:39:45 crc kubenswrapper[4669]: I1001 12:39:45.846977 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-5fxfq_1265856e-7658-44ca-b0a9-a0a5a42b8f5d/manager/0.log" Oct 01 12:39:45 crc kubenswrapper[4669]: I1001 12:39:45.981438 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-dv8s2_a887d629-1025-4da7-8c68-4b17c7205479/kube-rbac-proxy/0.log" Oct 01 12:39:46 crc kubenswrapper[4669]: I1001 12:39:46.133587 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-lbq2b_863b3375-804f-4c8b-ba14-01230d822604/kube-rbac-proxy/0.log" Oct 01 12:39:46 crc kubenswrapper[4669]: I1001 12:39:46.200865 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-dv8s2_a887d629-1025-4da7-8c68-4b17c7205479/manager/0.log" Oct 01 12:39:46 crc kubenswrapper[4669]: I1001 12:39:46.256299 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-lbq2b_863b3375-804f-4c8b-ba14-01230d822604/manager/0.log" Oct 01 12:39:46 crc kubenswrapper[4669]: I1001 12:39:46.373581 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-j8t6g_400a027c-2dab-48e5-a109-e7b64d35807a/kube-rbac-proxy/0.log" Oct 01 12:39:46 crc kubenswrapper[4669]: I1001 12:39:46.513389 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-j8t6g_400a027c-2dab-48e5-a109-e7b64d35807a/manager/0.log" Oct 01 12:39:46 crc 
kubenswrapper[4669]: I1001 12:39:46.569232 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-djnfk_99c2ea9b-bcc7-4933-9614-94c32861e93c/kube-rbac-proxy/0.log" Oct 01 12:39:46 crc kubenswrapper[4669]: I1001 12:39:46.627706 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-djnfk_99c2ea9b-bcc7-4933-9614-94c32861e93c/manager/0.log" Oct 01 12:39:46 crc kubenswrapper[4669]: I1001 12:39:46.747507 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-fssdp_08897606-8ccd-4508-bf20-501855920e9e/kube-rbac-proxy/0.log" Oct 01 12:39:46 crc kubenswrapper[4669]: I1001 12:39:46.808458 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-fssdp_08897606-8ccd-4508-bf20-501855920e9e/manager/0.log" Oct 01 12:39:46 crc kubenswrapper[4669]: I1001 12:39:46.919660 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-86p66_da724701-02fc-439b-ba86-52bde8cb3003/kube-rbac-proxy/0.log" Oct 01 12:39:47 crc kubenswrapper[4669]: I1001 12:39:47.032112 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-86p66_da724701-02fc-439b-ba86-52bde8cb3003/manager/0.log" Oct 01 12:39:47 crc kubenswrapper[4669]: I1001 12:39:47.186503 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-7qf7f_8783a088-91c6-4f3c-bc34-b3d5a805ea07/kube-rbac-proxy/0.log" Oct 01 12:39:47 crc kubenswrapper[4669]: I1001 12:39:47.200476 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-7qf7f_8783a088-91c6-4f3c-bc34-b3d5a805ea07/manager/0.log" Oct 01 12:39:47 crc kubenswrapper[4669]: I1001 12:39:47.291498 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-nr258_707270cf-007e-4572-bae9-dd6b4c6e50d3/kube-rbac-proxy/0.log" Oct 01 12:39:47 crc kubenswrapper[4669]: I1001 12:39:47.473491 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-nr258_707270cf-007e-4572-bae9-dd6b4c6e50d3/manager/0.log" Oct 01 12:39:47 crc kubenswrapper[4669]: I1001 12:39:47.585845 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cfvgks_91df1fb9-8c91-4dde-9317-ff09df368c49/kube-rbac-proxy/0.log" Oct 01 12:39:47 crc kubenswrapper[4669]: I1001 12:39:47.594307 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cfvgks_91df1fb9-8c91-4dde-9317-ff09df368c49/manager/0.log" Oct 01 12:39:47 crc kubenswrapper[4669]: I1001 12:39:47.706292 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6599487588-n9gx7_964d3ab1-839a-49e6-b7c8-46056b070131/kube-rbac-proxy/0.log" Oct 01 12:39:47 crc kubenswrapper[4669]: I1001 12:39:47.849743 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-76995989df-q6c9d_7fe95054-218b-47f4-a729-95a7f6b45a3d/kube-rbac-proxy/0.log" Oct 01 12:39:48 crc kubenswrapper[4669]: I1001 12:39:48.055192 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-76995989df-q6c9d_7fe95054-218b-47f4-a729-95a7f6b45a3d/operator/0.log" Oct 01 12:39:48 crc 
kubenswrapper[4669]: I1001 12:39:48.082956 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fmzck_684a045b-062b-4989-85cf-f621d5c88f39/registry-server/0.log" Oct 01 12:39:48 crc kubenswrapper[4669]: I1001 12:39:48.296401 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-5mbbk_621748e9-0765-432f-bbc9-9bb62594eff6/kube-rbac-proxy/0.log" Oct 01 12:39:48 crc kubenswrapper[4669]: I1001 12:39:48.376315 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-5mbbk_621748e9-0765-432f-bbc9-9bb62594eff6/manager/0.log" Oct 01 12:39:48 crc kubenswrapper[4669]: I1001 12:39:48.513127 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-bc8gx_4f573e37-cb0a-4eba-9477-7c3d71276c86/kube-rbac-proxy/0.log" Oct 01 12:39:48 crc kubenswrapper[4669]: I1001 12:39:48.564830 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-bc8gx_4f573e37-cb0a-4eba-9477-7c3d71276c86/manager/0.log" Oct 01 12:39:48 crc kubenswrapper[4669]: I1001 12:39:48.660811 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-szrsc_2545705c-a102-47ca-b42b-119670c5be57/operator/0.log" Oct 01 12:39:48 crc kubenswrapper[4669]: I1001 12:39:48.837758 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-2c2qw_e24ede8f-da24-4161-8621-d8b5abd08c1f/kube-rbac-proxy/0.log" Oct 01 12:39:48 crc kubenswrapper[4669]: I1001 12:39:48.861254 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-2c2qw_e24ede8f-da24-4161-8621-d8b5abd08c1f/manager/0.log" Oct 01 
12:39:49 crc kubenswrapper[4669]: I1001 12:39:49.016141 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6599487588-n9gx7_964d3ab1-839a-49e6-b7c8-46056b070131/manager/0.log" Oct 01 12:39:49 crc kubenswrapper[4669]: I1001 12:39:49.060414 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-p5qll_a2282a94-4700-4aae-8572-2104962decf8/kube-rbac-proxy/0.log" Oct 01 12:39:49 crc kubenswrapper[4669]: I1001 12:39:49.205842 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-p5qll_a2282a94-4700-4aae-8572-2104962decf8/manager/0.log" Oct 01 12:39:49 crc kubenswrapper[4669]: I1001 12:39:49.240888 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-vxwz5_fb18dab5-d638-443a-bb62-6508de79bc0f/kube-rbac-proxy/0.log" Oct 01 12:39:49 crc kubenswrapper[4669]: I1001 12:39:49.290635 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-vxwz5_fb18dab5-d638-443a-bb62-6508de79bc0f/manager/0.log" Oct 01 12:39:49 crc kubenswrapper[4669]: I1001 12:39:49.356772 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-lgckz_6d2b6087-c54d-4138-b162-e024a7a0e842/kube-rbac-proxy/0.log" Oct 01 12:39:49 crc kubenswrapper[4669]: I1001 12:39:49.436553 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-lgckz_6d2b6087-c54d-4138-b162-e024a7a0e842/manager/0.log" Oct 01 12:39:53 crc kubenswrapper[4669]: I1001 12:39:53.787210 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fbw9r"] Oct 01 12:39:53 crc kubenswrapper[4669]: E1001 
12:39:53.788529 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1062b7-a87d-4d90-b5f0-4bf4c6709b15" containerName="container-00" Oct 01 12:39:53 crc kubenswrapper[4669]: I1001 12:39:53.788546 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1062b7-a87d-4d90-b5f0-4bf4c6709b15" containerName="container-00" Oct 01 12:39:53 crc kubenswrapper[4669]: I1001 12:39:53.788760 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1062b7-a87d-4d90-b5f0-4bf4c6709b15" containerName="container-00" Oct 01 12:39:53 crc kubenswrapper[4669]: I1001 12:39:53.793770 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:39:53 crc kubenswrapper[4669]: I1001 12:39:53.812786 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbw9r"] Oct 01 12:39:53 crc kubenswrapper[4669]: I1001 12:39:53.914635 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-catalog-content\") pod \"redhat-operators-fbw9r\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:39:53 crc kubenswrapper[4669]: I1001 12:39:53.914790 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-utilities\") pod \"redhat-operators-fbw9r\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:39:53 crc kubenswrapper[4669]: I1001 12:39:53.915251 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57dkl\" (UniqueName: 
\"kubernetes.io/projected/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-kube-api-access-57dkl\") pod \"redhat-operators-fbw9r\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:39:54 crc kubenswrapper[4669]: I1001 12:39:54.018215 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-utilities\") pod \"redhat-operators-fbw9r\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:39:54 crc kubenswrapper[4669]: I1001 12:39:54.018308 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57dkl\" (UniqueName: \"kubernetes.io/projected/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-kube-api-access-57dkl\") pod \"redhat-operators-fbw9r\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:39:54 crc kubenswrapper[4669]: I1001 12:39:54.018408 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-catalog-content\") pod \"redhat-operators-fbw9r\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:39:54 crc kubenswrapper[4669]: I1001 12:39:54.018926 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-utilities\") pod \"redhat-operators-fbw9r\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:39:54 crc kubenswrapper[4669]: I1001 12:39:54.018972 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-catalog-content\") pod \"redhat-operators-fbw9r\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:39:54 crc kubenswrapper[4669]: I1001 12:39:54.043581 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57dkl\" (UniqueName: \"kubernetes.io/projected/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-kube-api-access-57dkl\") pod \"redhat-operators-fbw9r\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:39:54 crc kubenswrapper[4669]: I1001 12:39:54.167732 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:39:54 crc kubenswrapper[4669]: I1001 12:39:54.702349 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbw9r"] Oct 01 12:39:55 crc kubenswrapper[4669]: I1001 12:39:55.485002 4669 generic.go:334] "Generic (PLEG): container finished" podID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" containerID="59c2bcf34f51261175465dbde06bc20214fe2ce8f3677cf5e49d010271a21caf" exitCode=0 Oct 01 12:39:55 crc kubenswrapper[4669]: I1001 12:39:55.485765 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbw9r" event={"ID":"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c","Type":"ContainerDied","Data":"59c2bcf34f51261175465dbde06bc20214fe2ce8f3677cf5e49d010271a21caf"} Oct 01 12:39:55 crc kubenswrapper[4669]: I1001 12:39:55.485798 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbw9r" event={"ID":"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c","Type":"ContainerStarted","Data":"bbfe484117b2001df28c937a798b4d739416551ba8f96f1d9bf2804837a541b9"} Oct 01 12:39:57 crc kubenswrapper[4669]: I1001 12:39:57.514307 4669 generic.go:334] "Generic (PLEG): container finished" 
podID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" containerID="8b8e53e0c1405ab670d717684b1d5f508024c23dbbe76f1073313d731827b8df" exitCode=0 Oct 01 12:39:57 crc kubenswrapper[4669]: I1001 12:39:57.514442 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbw9r" event={"ID":"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c","Type":"ContainerDied","Data":"8b8e53e0c1405ab670d717684b1d5f508024c23dbbe76f1073313d731827b8df"} Oct 01 12:39:58 crc kubenswrapper[4669]: I1001 12:39:58.526869 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbw9r" event={"ID":"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c","Type":"ContainerStarted","Data":"d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7"} Oct 01 12:39:58 crc kubenswrapper[4669]: I1001 12:39:58.555377 4669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fbw9r" podStartSLOduration=3.11275705 podStartE2EDuration="5.555354888s" podCreationTimestamp="2025-10-01 12:39:53 +0000 UTC" firstStartedPulling="2025-10-01 12:39:55.488742986 +0000 UTC m=+4286.588307963" lastFinishedPulling="2025-10-01 12:39:57.931340824 +0000 UTC m=+4289.030905801" observedRunningTime="2025-10-01 12:39:58.54691197 +0000 UTC m=+4289.646476947" watchObservedRunningTime="2025-10-01 12:39:58.555354888 +0000 UTC m=+4289.654919855" Oct 01 12:40:01 crc kubenswrapper[4669]: I1001 12:40:01.863645 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:40:01 crc kubenswrapper[4669]: I1001 12:40:01.864523 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:40:04 crc kubenswrapper[4669]: I1001 12:40:04.168265 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:40:04 crc kubenswrapper[4669]: I1001 12:40:04.168773 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:40:04 crc kubenswrapper[4669]: I1001 12:40:04.230034 4669 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:40:05 crc kubenswrapper[4669]: I1001 12:40:05.196364 4669 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:40:05 crc kubenswrapper[4669]: I1001 12:40:05.263465 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbw9r"] Oct 01 12:40:06 crc kubenswrapper[4669]: I1001 12:40:06.619317 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fbw9r" podUID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" containerName="registry-server" containerID="cri-o://d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7" gracePeriod=2 Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.085949 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.265283 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-utilities\") pod \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.265874 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-catalog-content\") pod \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.265949 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57dkl\" (UniqueName: \"kubernetes.io/projected/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-kube-api-access-57dkl\") pod \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\" (UID: \"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c\") " Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.267051 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-utilities" (OuterVolumeSpecName: "utilities") pod "5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" (UID: "5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.370407 4669 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.635425 4669 generic.go:334] "Generic (PLEG): container finished" podID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" containerID="d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7" exitCode=0 Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.635499 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbw9r" event={"ID":"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c","Type":"ContainerDied","Data":"d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7"} Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.635529 4669 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fbw9r" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.635558 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbw9r" event={"ID":"5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c","Type":"ContainerDied","Data":"bbfe484117b2001df28c937a798b4d739416551ba8f96f1d9bf2804837a541b9"} Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.635594 4669 scope.go:117] "RemoveContainer" containerID="d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.676025 4669 scope.go:117] "RemoveContainer" containerID="8b8e53e0c1405ab670d717684b1d5f508024c23dbbe76f1073313d731827b8df" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.733325 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-kube-api-access-57dkl" (OuterVolumeSpecName: "kube-api-access-57dkl") pod "5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" (UID: "5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c"). InnerVolumeSpecName "kube-api-access-57dkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.761198 4669 scope.go:117] "RemoveContainer" containerID="59c2bcf34f51261175465dbde06bc20214fe2ce8f3677cf5e49d010271a21caf" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.780380 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57dkl\" (UniqueName: \"kubernetes.io/projected/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-kube-api-access-57dkl\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.830208 4669 scope.go:117] "RemoveContainer" containerID="d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7" Oct 01 12:40:07 crc kubenswrapper[4669]: E1001 12:40:07.830814 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7\": container with ID starting with d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7 not found: ID does not exist" containerID="d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.830863 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7"} err="failed to get container status \"d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7\": rpc error: code = NotFound desc = could not find container \"d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7\": container with ID starting with d29f696c434bce3ea27f464d82ed8491c6ae72ccacafbde34dcd18e82b0f98d7 not found: ID does not exist" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.830887 4669 scope.go:117] "RemoveContainer" containerID="8b8e53e0c1405ab670d717684b1d5f508024c23dbbe76f1073313d731827b8df" Oct 01 12:40:07 crc kubenswrapper[4669]: E1001 12:40:07.831884 
4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b8e53e0c1405ab670d717684b1d5f508024c23dbbe76f1073313d731827b8df\": container with ID starting with 8b8e53e0c1405ab670d717684b1d5f508024c23dbbe76f1073313d731827b8df not found: ID does not exist" containerID="8b8e53e0c1405ab670d717684b1d5f508024c23dbbe76f1073313d731827b8df" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.831912 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b8e53e0c1405ab670d717684b1d5f508024c23dbbe76f1073313d731827b8df"} err="failed to get container status \"8b8e53e0c1405ab670d717684b1d5f508024c23dbbe76f1073313d731827b8df\": rpc error: code = NotFound desc = could not find container \"8b8e53e0c1405ab670d717684b1d5f508024c23dbbe76f1073313d731827b8df\": container with ID starting with 8b8e53e0c1405ab670d717684b1d5f508024c23dbbe76f1073313d731827b8df not found: ID does not exist" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.831926 4669 scope.go:117] "RemoveContainer" containerID="59c2bcf34f51261175465dbde06bc20214fe2ce8f3677cf5e49d010271a21caf" Oct 01 12:40:07 crc kubenswrapper[4669]: E1001 12:40:07.832232 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59c2bcf34f51261175465dbde06bc20214fe2ce8f3677cf5e49d010271a21caf\": container with ID starting with 59c2bcf34f51261175465dbde06bc20214fe2ce8f3677cf5e49d010271a21caf not found: ID does not exist" containerID="59c2bcf34f51261175465dbde06bc20214fe2ce8f3677cf5e49d010271a21caf" Oct 01 12:40:07 crc kubenswrapper[4669]: I1001 12:40:07.832252 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c2bcf34f51261175465dbde06bc20214fe2ce8f3677cf5e49d010271a21caf"} err="failed to get container status \"59c2bcf34f51261175465dbde06bc20214fe2ce8f3677cf5e49d010271a21caf\": rpc error: code = 
NotFound desc = could not find container \"59c2bcf34f51261175465dbde06bc20214fe2ce8f3677cf5e49d010271a21caf\": container with ID starting with 59c2bcf34f51261175465dbde06bc20214fe2ce8f3677cf5e49d010271a21caf not found: ID does not exist" Oct 01 12:40:09 crc kubenswrapper[4669]: I1001 12:40:09.011458 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" (UID: "5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:40:09 crc kubenswrapper[4669]: I1001 12:40:09.030929 4669 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:09 crc kubenswrapper[4669]: I1001 12:40:09.171125 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbw9r"] Oct 01 12:40:09 crc kubenswrapper[4669]: I1001 12:40:09.183942 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fbw9r"] Oct 01 12:40:09 crc kubenswrapper[4669]: I1001 12:40:09.488223 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jfjm9_83f83da4-e855-4070-b524-4b7b789d0215/control-plane-machine-set-operator/0.log" Oct 01 12:40:09 crc kubenswrapper[4669]: I1001 12:40:09.668792 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" path="/var/lib/kubelet/pods/5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c/volumes" Oct 01 12:40:09 crc kubenswrapper[4669]: I1001 12:40:09.683713 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7q7n5_e7445657-b8e4-4974-a680-7a05f0628fb7/kube-rbac-proxy/0.log" Oct 01 12:40:09 crc kubenswrapper[4669]: I1001 12:40:09.718056 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7q7n5_e7445657-b8e4-4974-a680-7a05f0628fb7/machine-api-operator/0.log" Oct 01 12:40:23 crc kubenswrapper[4669]: I1001 12:40:23.104882 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-vczt8_2c0929fd-88f7-47d4-9975-54d4d6c606c0/cert-manager-controller/0.log" Oct 01 12:40:23 crc kubenswrapper[4669]: I1001 12:40:23.292238 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-j5v46_bb59959e-c15d-466f-8809-66c2ae4c8a0b/cert-manager-webhook/0.log" Oct 01 12:40:23 crc kubenswrapper[4669]: I1001 12:40:23.306068 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-6kfl9_8f212951-fc37-4759-8933-2cee5f94845e/cert-manager-cainjector/0.log" Oct 01 12:40:31 crc kubenswrapper[4669]: I1001 12:40:31.863475 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:40:31 crc kubenswrapper[4669]: I1001 12:40:31.864441 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:40:36 crc kubenswrapper[4669]: I1001 12:40:36.803522 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-wlrv2_39594755-e0c6-4941-ac5c-b847a32459ff/nmstate-console-plugin/0.log" Oct 01 12:40:36 crc kubenswrapper[4669]: I1001 12:40:36.990049 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8p9bl_a2c1c01f-82d8-48e3-a140-14f363594918/nmstate-handler/0.log" Oct 01 12:40:37 crc kubenswrapper[4669]: I1001 12:40:37.057102 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-87fqs_10471e2d-ad87-44b7-af2e-b2209ae9337e/nmstate-metrics/0.log" Oct 01 12:40:37 crc kubenswrapper[4669]: I1001 12:40:37.061481 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-87fqs_10471e2d-ad87-44b7-af2e-b2209ae9337e/kube-rbac-proxy/0.log" Oct 01 12:40:37 crc kubenswrapper[4669]: I1001 12:40:37.233438 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-t4r7r_d776bb0e-3c68-4273-8aa2-e17ce4299e0c/nmstate-operator/0.log" Oct 01 12:40:37 crc kubenswrapper[4669]: I1001 12:40:37.304565 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-vbw79_4a99a9fe-0aaa-496b-97f2-e0964378b735/nmstate-webhook/0.log" Oct 01 12:40:52 crc kubenswrapper[4669]: I1001 12:40:52.462158 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-8kstm_05969ea4-e97c-4b66-aa70-c4909a58472b/kube-rbac-proxy/0.log" Oct 01 12:40:52 crc kubenswrapper[4669]: I1001 12:40:52.671903 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-8kstm_05969ea4-e97c-4b66-aa70-c4909a58472b/controller/0.log" Oct 01 12:40:52 crc kubenswrapper[4669]: I1001 12:40:52.751104 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-frr-files/0.log" Oct 
01 12:40:52 crc kubenswrapper[4669]: I1001 12:40:52.964550 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-frr-files/0.log" Oct 01 12:40:52 crc kubenswrapper[4669]: I1001 12:40:52.988115 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-reloader/0.log" Oct 01 12:40:52 crc kubenswrapper[4669]: I1001 12:40:52.997967 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-metrics/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.021376 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-reloader/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.213426 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-frr-files/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.250780 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-reloader/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.254149 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-metrics/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.263194 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-metrics/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.428582 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-frr-files/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.472940 4669 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-metrics/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.474731 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/cp-reloader/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.476930 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/controller/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.684257 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/kube-rbac-proxy/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.705236 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/frr-metrics/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.705947 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/kube-rbac-proxy-frr/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.949766 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/reloader/0.log" Oct 01 12:40:53 crc kubenswrapper[4669]: I1001 12:40:53.956731 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-pnqmb_b38f6785-4644-476d-9014-3ad44957a952/frr-k8s-webhook-server/0.log" Oct 01 12:40:54 crc kubenswrapper[4669]: I1001 12:40:54.183917 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6774cc6d74-d656r_562d1f16-7779-4cfb-ae80-5bad719475d1/manager/0.log" Oct 01 12:40:55 crc kubenswrapper[4669]: I1001 12:40:55.236517 4669 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7c796f5894-wqh8w_c8793365-44bd-4d00-aa95-2d23bd134f23/webhook-server/0.log" Oct 01 12:40:55 crc kubenswrapper[4669]: I1001 12:40:55.254162 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-t6f9w_fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230/kube-rbac-proxy/0.log" Oct 01 12:40:55 crc kubenswrapper[4669]: I1001 12:40:55.364452 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4lq96_204f5c6d-d71c-4ab6-bfc8-a8682b4e997b/frr/0.log" Oct 01 12:40:55 crc kubenswrapper[4669]: I1001 12:40:55.756650 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-t6f9w_fa7d4df4-d56f-4e1c-91f2-cfb2edb9e230/speaker/0.log" Oct 01 12:41:01 crc kubenswrapper[4669]: I1001 12:41:01.864487 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:41:01 crc kubenswrapper[4669]: I1001 12:41:01.864874 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:41:01 crc kubenswrapper[4669]: I1001 12:41:01.864934 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" Oct 01 12:41:01 crc kubenswrapper[4669]: I1001 12:41:01.866009 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"229df817fe94baa1aade7478bd01c70efd0a8f5ad4457de01e0b88aee5ac9fa9"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:41:01 crc kubenswrapper[4669]: I1001 12:41:01.866092 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://229df817fe94baa1aade7478bd01c70efd0a8f5ad4457de01e0b88aee5ac9fa9" gracePeriod=600 Oct 01 12:41:02 crc kubenswrapper[4669]: I1001 12:41:02.265833 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="229df817fe94baa1aade7478bd01c70efd0a8f5ad4457de01e0b88aee5ac9fa9" exitCode=0 Oct 01 12:41:02 crc kubenswrapper[4669]: I1001 12:41:02.265909 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"229df817fe94baa1aade7478bd01c70efd0a8f5ad4457de01e0b88aee5ac9fa9"} Oct 01 12:41:02 crc kubenswrapper[4669]: I1001 12:41:02.266479 4669 scope.go:117] "RemoveContainer" containerID="33eaec9cbbfb06d2cc50ffb17af40cf636ae2834c3fd057ea421df3c19e4e78c" Oct 01 12:41:03 crc kubenswrapper[4669]: I1001 12:41:03.282811 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerStarted","Data":"a1fb62ecc9485b41b398865d6ff57946e68ad74e827f332ed897ae1543e84beb"} Oct 01 12:41:11 crc kubenswrapper[4669]: I1001 12:41:11.211751 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/util/0.log" Oct 01 12:41:11 crc kubenswrapper[4669]: I1001 12:41:11.469587 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/pull/0.log" Oct 01 12:41:11 crc kubenswrapper[4669]: I1001 12:41:11.487783 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/util/0.log" Oct 01 12:41:11 crc kubenswrapper[4669]: I1001 12:41:11.515995 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/pull/0.log" Oct 01 12:41:11 crc kubenswrapper[4669]: I1001 12:41:11.737404 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/util/0.log" Oct 01 12:41:11 crc kubenswrapper[4669]: I1001 12:41:11.752342 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/pull/0.log" Oct 01 12:41:11 crc kubenswrapper[4669]: I1001 12:41:11.754400 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bccqkt9_57b1aea1-6b22-4512-b88f-bafc19415c87/extract/0.log" Oct 01 12:41:11 crc kubenswrapper[4669]: I1001 12:41:11.973800 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-utilities/0.log" Oct 01 12:41:12 crc 
kubenswrapper[4669]: I1001 12:41:12.143566 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-utilities/0.log" Oct 01 12:41:12 crc kubenswrapper[4669]: I1001 12:41:12.162825 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-content/0.log" Oct 01 12:41:12 crc kubenswrapper[4669]: I1001 12:41:12.217993 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-content/0.log" Oct 01 12:41:12 crc kubenswrapper[4669]: I1001 12:41:12.418577 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-content/0.log" Oct 01 12:41:12 crc kubenswrapper[4669]: I1001 12:41:12.445629 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/extract-utilities/0.log" Oct 01 12:41:12 crc kubenswrapper[4669]: I1001 12:41:12.719710 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-utilities/0.log" Oct 01 12:41:12 crc kubenswrapper[4669]: I1001 12:41:12.885424 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-content/0.log" Oct 01 12:41:12 crc kubenswrapper[4669]: I1001 12:41:12.897169 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f2vsx_2e4864c1-9d72-45e1-a602-fe0a6687811c/registry-server/0.log" Oct 01 12:41:12 crc kubenswrapper[4669]: I1001 12:41:12.951823 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-content/0.log" Oct 01 12:41:12 crc kubenswrapper[4669]: I1001 12:41:12.952745 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-utilities/0.log" Oct 01 12:41:13 crc kubenswrapper[4669]: I1001 12:41:13.147934 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-utilities/0.log" Oct 01 12:41:13 crc kubenswrapper[4669]: I1001 12:41:13.206655 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/extract-content/0.log" Oct 01 12:41:13 crc kubenswrapper[4669]: I1001 12:41:13.421495 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/util/0.log" Oct 01 12:41:13 crc kubenswrapper[4669]: I1001 12:41:13.642963 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/pull/0.log" Oct 01 12:41:13 crc kubenswrapper[4669]: I1001 12:41:13.694491 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/util/0.log" Oct 01 12:41:13 crc kubenswrapper[4669]: I1001 12:41:13.698844 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/pull/0.log" Oct 01 12:41:13 crc kubenswrapper[4669]: I1001 12:41:13.927800 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/util/0.log" Oct 01 12:41:13 crc kubenswrapper[4669]: I1001 12:41:13.928818 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/pull/0.log" Oct 01 12:41:13 crc kubenswrapper[4669]: I1001 12:41:13.995842 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntnfv_a5df8eb3-5517-4e0c-af77-565bddc9fe52/registry-server/0.log" Oct 01 12:41:14 crc kubenswrapper[4669]: I1001 12:41:14.031012 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9668kpq_47f233f9-29d5-4aaa-b9d5-5514aaf44d14/extract/0.log" Oct 01 12:41:14 crc kubenswrapper[4669]: I1001 12:41:14.242325 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-utilities/0.log" Oct 01 12:41:14 crc kubenswrapper[4669]: I1001 12:41:14.281222 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-hg5t5_7693f22a-6758-4b18-8161-c5eb5e27a395/marketplace-operator/0.log" Oct 01 12:41:14 crc kubenswrapper[4669]: I1001 12:41:14.434413 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-utilities/0.log" Oct 01 12:41:14 crc kubenswrapper[4669]: I1001 12:41:14.487428 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-content/0.log" Oct 01 12:41:14 crc kubenswrapper[4669]: I1001 12:41:14.495293 4669 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-content/0.log" Oct 01 12:41:14 crc kubenswrapper[4669]: I1001 12:41:14.748239 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-utilities/0.log" Oct 01 12:41:14 crc kubenswrapper[4669]: I1001 12:41:14.760016 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/extract-content/0.log" Oct 01 12:41:14 crc kubenswrapper[4669]: I1001 12:41:14.848882 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttz57_9eab31d8-034e-464c-a5c8-f24b4dcbccb7/registry-server/0.log" Oct 01 12:41:14 crc kubenswrapper[4669]: I1001 12:41:14.954589 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-utilities/0.log" Oct 01 12:41:15 crc kubenswrapper[4669]: I1001 12:41:15.160589 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-content/0.log" Oct 01 12:41:15 crc kubenswrapper[4669]: I1001 12:41:15.162499 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-utilities/0.log" Oct 01 12:41:15 crc kubenswrapper[4669]: I1001 12:41:15.170098 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-content/0.log" Oct 01 12:41:15 crc kubenswrapper[4669]: I1001 12:41:15.334815 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-utilities/0.log" 
Oct 01 12:41:15 crc kubenswrapper[4669]: I1001 12:41:15.404443 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/extract-content/0.log"
Oct 01 12:41:15 crc kubenswrapper[4669]: I1001 12:41:15.997397 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9gtjj_1d950283-1340-49ba-8ddb-35326c3f375e/registry-server/0.log"
Oct 01 12:41:54 crc kubenswrapper[4669]: E1001 12:41:54.946955 4669 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.82:47202->38.102.83.82:43797: write tcp 38.102.83.82:47202->38.102.83.82:43797: write: connection reset by peer
Oct 01 12:43:31 crc kubenswrapper[4669]: I1001 12:43:31.863572 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 12:43:31 crc kubenswrapper[4669]: I1001 12:43:31.864409 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 12:43:35 crc kubenswrapper[4669]: I1001 12:43:35.070820 4669 generic.go:334] "Generic (PLEG): container finished" podID="0a238c08-a2bf-432a-967c-79e1b4dcbfa6" containerID="7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d" exitCode=0
Oct 01 12:43:35 crc kubenswrapper[4669]: I1001 12:43:35.070945 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hxvt9/must-gather-ps8sr" event={"ID":"0a238c08-a2bf-432a-967c-79e1b4dcbfa6","Type":"ContainerDied","Data":"7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d"}
Oct 01 12:43:35 crc kubenswrapper[4669]: I1001 12:43:35.073237 4669 scope.go:117] "RemoveContainer" containerID="7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d"
Oct 01 12:43:35 crc kubenswrapper[4669]: I1001 12:43:35.542441 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hxvt9_must-gather-ps8sr_0a238c08-a2bf-432a-967c-79e1b4dcbfa6/gather/0.log"
Oct 01 12:43:39 crc kubenswrapper[4669]: I1001 12:43:39.106744 4669 scope.go:117] "RemoveContainer" containerID="b719e7f6d9256d982311a31312976a85a7e2e35eacc733dbf89ada36a1770f6e"
Oct 01 12:43:48 crc kubenswrapper[4669]: I1001 12:43:48.405890 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hxvt9/must-gather-ps8sr"]
Oct 01 12:43:48 crc kubenswrapper[4669]: I1001 12:43:48.406782 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hxvt9/must-gather-ps8sr" podUID="0a238c08-a2bf-432a-967c-79e1b4dcbfa6" containerName="copy" containerID="cri-o://6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80" gracePeriod=2
Oct 01 12:43:48 crc kubenswrapper[4669]: I1001 12:43:48.415617 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hxvt9/must-gather-ps8sr"]
Oct 01 12:43:48 crc kubenswrapper[4669]: I1001 12:43:48.926997 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hxvt9_must-gather-ps8sr_0a238c08-a2bf-432a-967c-79e1b4dcbfa6/copy/0.log"
Oct 01 12:43:48 crc kubenswrapper[4669]: I1001 12:43:48.927834 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hxvt9/must-gather-ps8sr"
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.073768 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4dx6\" (UniqueName: \"kubernetes.io/projected/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-kube-api-access-n4dx6\") pod \"0a238c08-a2bf-432a-967c-79e1b4dcbfa6\" (UID: \"0a238c08-a2bf-432a-967c-79e1b4dcbfa6\") "
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.074482 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-must-gather-output\") pod \"0a238c08-a2bf-432a-967c-79e1b4dcbfa6\" (UID: \"0a238c08-a2bf-432a-967c-79e1b4dcbfa6\") "
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.083574 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-kube-api-access-n4dx6" (OuterVolumeSpecName: "kube-api-access-n4dx6") pod "0a238c08-a2bf-432a-967c-79e1b4dcbfa6" (UID: "0a238c08-a2bf-432a-967c-79e1b4dcbfa6"). InnerVolumeSpecName "kube-api-access-n4dx6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.177609 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4dx6\" (UniqueName: \"kubernetes.io/projected/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-kube-api-access-n4dx6\") on node \"crc\" DevicePath \"\""
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.252213 4669 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hxvt9_must-gather-ps8sr_0a238c08-a2bf-432a-967c-79e1b4dcbfa6/copy/0.log"
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.252557 4669 generic.go:334] "Generic (PLEG): container finished" podID="0a238c08-a2bf-432a-967c-79e1b4dcbfa6" containerID="6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80" exitCode=143
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.252617 4669 scope.go:117] "RemoveContainer" containerID="6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80"
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.252767 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hxvt9/must-gather-ps8sr"
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.288389 4669 scope.go:117] "RemoveContainer" containerID="7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d"
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.298196 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0a238c08-a2bf-432a-967c-79e1b4dcbfa6" (UID: "0a238c08-a2bf-432a-967c-79e1b4dcbfa6"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.359594 4669 scope.go:117] "RemoveContainer" containerID="6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80"
Oct 01 12:43:49 crc kubenswrapper[4669]: E1001 12:43:49.360186 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80\": container with ID starting with 6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80 not found: ID does not exist" containerID="6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80"
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.360240 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80"} err="failed to get container status \"6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80\": rpc error: code = NotFound desc = could not find container \"6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80\": container with ID starting with 6d5866eb5ec608c8b09c4039547e0d1b57510abadc99fe6cde65a7e661ab4e80 not found: ID does not exist"
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.360276 4669 scope.go:117] "RemoveContainer" containerID="7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d"
Oct 01 12:43:49 crc kubenswrapper[4669]: E1001 12:43:49.360698 4669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d\": container with ID starting with 7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d not found: ID does not exist" containerID="7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d"
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.360736 4669 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d"} err="failed to get container status \"7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d\": rpc error: code = NotFound desc = could not find container \"7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d\": container with ID starting with 7bb50462470106492e5d1a2a1f0d4c488364dd7fa291120695b16337b1de000d not found: ID does not exist"
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.381504 4669 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a238c08-a2bf-432a-967c-79e1b4dcbfa6-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 01 12:43:49 crc kubenswrapper[4669]: I1001 12:43:49.664651 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a238c08-a2bf-432a-967c-79e1b4dcbfa6" path="/var/lib/kubelet/pods/0a238c08-a2bf-432a-967c-79e1b4dcbfa6/volumes"
Oct 01 12:44:01 crc kubenswrapper[4669]: I1001 12:44:01.864283 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 12:44:01 crc kubenswrapper[4669]: I1001 12:44:01.865342 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 12:44:31 crc kubenswrapper[4669]: I1001 12:44:31.863424 4669 patch_prober.go:28] interesting pod/machine-config-daemon-5rfqz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 12:44:31 crc kubenswrapper[4669]: I1001 12:44:31.864953 4669 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 12:44:31 crc kubenswrapper[4669]: I1001 12:44:31.865028 4669 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz"
Oct 01 12:44:31 crc kubenswrapper[4669]: I1001 12:44:31.871403 4669 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1fb62ecc9485b41b398865d6ff57946e68ad74e827f332ed897ae1543e84beb"} pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 12:44:31 crc kubenswrapper[4669]: I1001 12:44:31.871794 4669 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerName="machine-config-daemon" containerID="cri-o://a1fb62ecc9485b41b398865d6ff57946e68ad74e827f332ed897ae1543e84beb" gracePeriod=600
Oct 01 12:44:31 crc kubenswrapper[4669]: E1001 12:44:31.996982 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2"
Oct 01 12:44:32 crc kubenswrapper[4669]: I1001 12:44:32.781482 4669 generic.go:334] "Generic (PLEG): container finished" podID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2" containerID="a1fb62ecc9485b41b398865d6ff57946e68ad74e827f332ed897ae1543e84beb" exitCode=0
Oct 01 12:44:32 crc kubenswrapper[4669]: I1001 12:44:32.781525 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" event={"ID":"a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2","Type":"ContainerDied","Data":"a1fb62ecc9485b41b398865d6ff57946e68ad74e827f332ed897ae1543e84beb"}
Oct 01 12:44:32 crc kubenswrapper[4669]: I1001 12:44:32.781954 4669 scope.go:117] "RemoveContainer" containerID="229df817fe94baa1aade7478bd01c70efd0a8f5ad4457de01e0b88aee5ac9fa9"
Oct 01 12:44:32 crc kubenswrapper[4669]: I1001 12:44:32.783041 4669 scope.go:117] "RemoveContainer" containerID="a1fb62ecc9485b41b398865d6ff57946e68ad74e827f332ed897ae1543e84beb"
Oct 01 12:44:32 crc kubenswrapper[4669]: E1001 12:44:32.783793 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2"
Oct 01 12:44:39 crc kubenswrapper[4669]: I1001 12:44:39.301482 4669 scope.go:117] "RemoveContainer" containerID="fa76ed85b865e5b195d5e0bedb673fe02229a02280ffe8411ccf22dd46396f29"
Oct 01 12:44:39 crc kubenswrapper[4669]: I1001 12:44:39.336191 4669 scope.go:117] "RemoveContainer" containerID="abf4df06cce6e3eae006b0c189a004b3769e9035bb84d0e08e493f0dab64fe20"
Oct 01 12:44:39 crc kubenswrapper[4669]: I1001 12:44:39.431170 4669 scope.go:117] "RemoveContainer" containerID="b209e7193df754c248524b0ea0a362565649087e86334d04b9f9b2a875321d42"
Oct 01 12:44:45 crc kubenswrapper[4669]: I1001 12:44:45.646142 4669 scope.go:117] "RemoveContainer" containerID="a1fb62ecc9485b41b398865d6ff57946e68ad74e827f332ed897ae1543e84beb"
Oct 01 12:44:45 crc kubenswrapper[4669]: E1001 12:44:45.647724 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2"
Oct 01 12:44:59 crc kubenswrapper[4669]: I1001 12:44:59.659871 4669 scope.go:117] "RemoveContainer" containerID="a1fb62ecc9485b41b398865d6ff57946e68ad74e827f332ed897ae1543e84beb"
Oct 01 12:44:59 crc kubenswrapper[4669]: E1001 12:44:59.660937 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.193180 4669 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"]
Oct 01 12:45:00 crc kubenswrapper[4669]: E1001 12:45:00.194373 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" containerName="extract-utilities"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.194409 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" containerName="extract-utilities"
Oct 01 12:45:00 crc kubenswrapper[4669]: E1001 12:45:00.194457 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" containerName="extract-content"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.194473 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" containerName="extract-content"
Oct 01 12:45:00 crc kubenswrapper[4669]: E1001 12:45:00.194496 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" containerName="registry-server"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.194510 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" containerName="registry-server"
Oct 01 12:45:00 crc kubenswrapper[4669]: E1001 12:45:00.194535 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a238c08-a2bf-432a-967c-79e1b4dcbfa6" containerName="copy"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.194548 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a238c08-a2bf-432a-967c-79e1b4dcbfa6" containerName="copy"
Oct 01 12:45:00 crc kubenswrapper[4669]: E1001 12:45:00.194603 4669 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a238c08-a2bf-432a-967c-79e1b4dcbfa6" containerName="gather"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.194617 4669 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a238c08-a2bf-432a-967c-79e1b4dcbfa6" containerName="gather"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.195009 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a238c08-a2bf-432a-967c-79e1b4dcbfa6" containerName="gather"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.195044 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aa88f6c-9c81-4fa6-b2d8-b9fd2982bd6c" containerName="registry-server"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.195102 4669 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a238c08-a2bf-432a-967c-79e1b4dcbfa6" containerName="copy"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.196374 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.199278 4669 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.199453 4669 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.202532 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"]
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.307778 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-secret-volume\") pod \"collect-profiles-29322045-pw9pq\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.308133 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-config-volume\") pod \"collect-profiles-29322045-pw9pq\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.311346 4669 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgt86\" (UniqueName: \"kubernetes.io/projected/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-kube-api-access-dgt86\") pod \"collect-profiles-29322045-pw9pq\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.414423 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-config-volume\") pod \"collect-profiles-29322045-pw9pq\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.414686 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgt86\" (UniqueName: \"kubernetes.io/projected/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-kube-api-access-dgt86\") pod \"collect-profiles-29322045-pw9pq\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.414806 4669 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-secret-volume\") pod \"collect-profiles-29322045-pw9pq\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.416033 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-config-volume\") pod \"collect-profiles-29322045-pw9pq\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.432274 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-secret-volume\") pod \"collect-profiles-29322045-pw9pq\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.439341 4669 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgt86\" (UniqueName: \"kubernetes.io/projected/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-kube-api-access-dgt86\") pod \"collect-profiles-29322045-pw9pq\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:00 crc kubenswrapper[4669]: I1001 12:45:00.516231 4669 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:01 crc kubenswrapper[4669]: I1001 12:45:01.049413 4669 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"]
Oct 01 12:45:01 crc kubenswrapper[4669]: I1001 12:45:01.160550 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq" event={"ID":"e54c5442-b76a-49d1-b2a6-658ea64dcdf1","Type":"ContainerStarted","Data":"5e67386140179920e41a1513f9e53775a5e62a0c1d4ac219c92e82b46f0d6635"}
Oct 01 12:45:02 crc kubenswrapper[4669]: I1001 12:45:02.176186 4669 generic.go:334] "Generic (PLEG): container finished" podID="e54c5442-b76a-49d1-b2a6-658ea64dcdf1" containerID="d4a24131f12f7adffc1db82c4ae413da9e20e089d44fc89e97bc7223e8b5694e" exitCode=0
Oct 01 12:45:02 crc kubenswrapper[4669]: I1001 12:45:02.176271 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq" event={"ID":"e54c5442-b76a-49d1-b2a6-658ea64dcdf1","Type":"ContainerDied","Data":"d4a24131f12f7adffc1db82c4ae413da9e20e089d44fc89e97bc7223e8b5694e"}
Oct 01 12:45:03 crc kubenswrapper[4669]: I1001 12:45:03.572771 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:03 crc kubenswrapper[4669]: I1001 12:45:03.700688 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-secret-volume\") pod \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") "
Oct 01 12:45:03 crc kubenswrapper[4669]: I1001 12:45:03.700790 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-config-volume\") pod \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") "
Oct 01 12:45:03 crc kubenswrapper[4669]: I1001 12:45:03.700902 4669 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgt86\" (UniqueName: \"kubernetes.io/projected/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-kube-api-access-dgt86\") pod \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\" (UID: \"e54c5442-b76a-49d1-b2a6-658ea64dcdf1\") "
Oct 01 12:45:03 crc kubenswrapper[4669]: I1001 12:45:03.704818 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-config-volume" (OuterVolumeSpecName: "config-volume") pod "e54c5442-b76a-49d1-b2a6-658ea64dcdf1" (UID: "e54c5442-b76a-49d1-b2a6-658ea64dcdf1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 12:45:03 crc kubenswrapper[4669]: I1001 12:45:03.710400 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-kube-api-access-dgt86" (OuterVolumeSpecName: "kube-api-access-dgt86") pod "e54c5442-b76a-49d1-b2a6-658ea64dcdf1" (UID: "e54c5442-b76a-49d1-b2a6-658ea64dcdf1"). InnerVolumeSpecName "kube-api-access-dgt86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 12:45:03 crc kubenswrapper[4669]: I1001 12:45:03.711982 4669 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e54c5442-b76a-49d1-b2a6-658ea64dcdf1" (UID: "e54c5442-b76a-49d1-b2a6-658ea64dcdf1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 12:45:03 crc kubenswrapper[4669]: I1001 12:45:03.805629 4669 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 01 12:45:03 crc kubenswrapper[4669]: I1001 12:45:03.805686 4669 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-config-volume\") on node \"crc\" DevicePath \"\""
Oct 01 12:45:03 crc kubenswrapper[4669]: I1001 12:45:03.805699 4669 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgt86\" (UniqueName: \"kubernetes.io/projected/e54c5442-b76a-49d1-b2a6-658ea64dcdf1-kube-api-access-dgt86\") on node \"crc\" DevicePath \"\""
Oct 01 12:45:04 crc kubenswrapper[4669]: I1001 12:45:04.213904 4669 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq" event={"ID":"e54c5442-b76a-49d1-b2a6-658ea64dcdf1","Type":"ContainerDied","Data":"5e67386140179920e41a1513f9e53775a5e62a0c1d4ac219c92e82b46f0d6635"}
Oct 01 12:45:04 crc kubenswrapper[4669]: I1001 12:45:04.214697 4669 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e67386140179920e41a1513f9e53775a5e62a0c1d4ac219c92e82b46f0d6635"
Oct 01 12:45:04 crc kubenswrapper[4669]: I1001 12:45:04.213950 4669 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-pw9pq"
Oct 01 12:45:04 crc kubenswrapper[4669]: I1001 12:45:04.707312 4669 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784"]
Oct 01 12:45:04 crc kubenswrapper[4669]: I1001 12:45:04.728634 4669 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322000-jw784"]
Oct 01 12:45:05 crc kubenswrapper[4669]: I1001 12:45:05.660159 4669 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51feb999-df70-4811-a60f-ae7968fbd9d1" path="/var/lib/kubelet/pods/51feb999-df70-4811-a60f-ae7968fbd9d1/volumes"
Oct 01 12:45:10 crc kubenswrapper[4669]: I1001 12:45:10.645136 4669 scope.go:117] "RemoveContainer" containerID="a1fb62ecc9485b41b398865d6ff57946e68ad74e827f332ed897ae1543e84beb"
Oct 01 12:45:10 crc kubenswrapper[4669]: E1001 12:45:10.646370 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2"
Oct 01 12:45:21 crc kubenswrapper[4669]: I1001 12:45:21.645194 4669 scope.go:117] "RemoveContainer" containerID="a1fb62ecc9485b41b398865d6ff57946e68ad74e827f332ed897ae1543e84beb"
Oct 01 12:45:21 crc kubenswrapper[4669]: E1001 12:45:21.647360 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2"
Oct 01 12:45:33 crc kubenswrapper[4669]: I1001 12:45:33.652116 4669 scope.go:117] "RemoveContainer" containerID="a1fb62ecc9485b41b398865d6ff57946e68ad74e827f332ed897ae1543e84beb"
Oct 01 12:45:33 crc kubenswrapper[4669]: E1001 12:45:33.653503 4669 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5rfqz_openshift-machine-config-operator(a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2)\"" pod="openshift-machine-config-operator/machine-config-daemon-5rfqz" podUID="a2a0a9d6-edb9-49ce-aa22-cdac1d6a49b2"
Oct 01 12:45:39 crc kubenswrapper[4669]: I1001 12:45:39.540804 4669 scope.go:117] "RemoveContainer" containerID="d226940d390160316228fc7ac772f82db34dd49183ee5399d0a5fcea580d8652"
Oct 01 12:45:39 crc kubenswrapper[4669]: I1001 12:45:39.572691 4669 scope.go:117] "RemoveContainer" containerID="6e160f928f69eebca6cdd4c556ae0959ab6fd6dc17fb32fa976b6c42dda33280"